How to write ten thousand data points per second to InfluxDB?

Hello, I’m using a Raspberry Pi 4 to collect sensor data (floats) from an ADC with a Python script.
The ADC can read ten thousand samples per second.
Now I want to save these data to InfluxDB for real-time analysis; the system produces a lot of data.
I have tried saving the readings to a log file while reading, and using Telegraf to collect them, as this blog describes: https://www.iotforall.com/iot-dashboard
But that doesn’t keep up with such a large data stream.
So how can I write these data to InfluxDB in time? Are there any solutions?
Thank you, much appreciated :slight_smile:

Hello @Yoshiko,
Welcome!
What errors are you getting? Can you please set debug=true in your Telegraf config and share your logs?
Additionally you might be interested in: Write Millions of Points From CSV to InfluxDB with the 2.0 Python Client | InfluxData

If you’re running InfluxDB on the Raspberry Pi (or any server, really), it’s worth getting SSD storage; SSDs are better value per GB than USB sticks and perform better too.

Edit: please share your code so far, and use the preformatted text button </> too
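In the meantime, one common pattern for a stream this fast is to skip the log file and batch readings as InfluxDB line protocol sent over UDP to Telegraf’s socket_listener input. A minimal sketch, assuming a socket_listener on UDP port 8094 (`service_address = "udp://:8094"` with `data_format = "influx"`); the measurement name `adc`, the batch size, and the `read_adc()` helper are placeholders, not from your setup:

```python
import socket
import time

TELEGRAF_ADDR = ("127.0.0.1", 8094)  # assumed socket_listener address
BATCH_SIZE = 1000                    # points per datagram; stays well under 64 KB

def to_line_protocol(measurement: str, value: float, ts_ns: int) -> str:
    """Format one reading as InfluxDB line protocol with a nanosecond timestamp."""
    return f"{measurement} value={value} {ts_ns}"

def flush(sock: socket.socket, batch: list) -> None:
    """Send one newline-delimited batch of points in a single UDP datagram."""
    if batch:
        sock.sendto(("\n".join(batch) + "\n").encode(), TELEGRAF_ADDR)
        batch.clear()

def read_adc() -> float:
    """Placeholder for the real ADC read call."""
    return 0.0

def main(n_samples: int = 10_000) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    batch = []
    for _ in range(n_samples):
        batch.append(to_line_protocol("adc", read_adc(), time.time_ns()))
        if len(batch) >= BATCH_SIZE:
            flush(sock, batch)
    flush(sock, batch)  # flush the final partial batch
    sock.close()
```

Telegraf then batches again on its side when writing to InfluxDB, so the Python loop never blocks on an HTTP round trip. Note UDP is fire-and-forget: if loss matters, use `tcp://` in the listener address and a TCP socket instead.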