Data ingestion - CSV files

I am trying to figure out the best way to ingest a large volume of data into InfluxDB.

Our interval meter data is generated at roughly 1 GB per day, spread across many CSV files. We have decided to use InfluxDB, since it is a TSDB, to store our data, and Chronograf to visualize it. My question is: how can we ingest our data into InfluxDB? Does it make sense to ingest flat files (CSV) directly into InfluxDB? If so, how can we do that?

Hi @Amin_Mohebi,

Welcome to InfluxDB! How is the data being generated and pushed to the CSV files? What is generating the data? Is it possible to push it either straight into InfluxDB via line protocol, or into Telegraf using one of the Telegraf plugins? Right now there is no direct CSV importer (see the open issue), so gathering the data straight from the source is likely the easiest approach.

Best regards,

For tasks like this I’ve had Telegraf run a Python script that takes in the data, converts it to line protocol, and writes it to stdout.
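A minimal sketch of such a converter, assuming hypothetical CSV columns (`meter_id`, `timestamp`, `kwh`) since the real file layout isn't shown in the thread; adjust the names to your actual format:

```python
#!/usr/bin/env python3
"""Convert meter CSV rows (read from stdin) to InfluxDB line protocol on stdout.

Assumes a hypothetical CSV layout: meter_id,timestamp,kwh
"""
import csv
import sys
from datetime import datetime, timezone

def csv_to_line_protocol(rows):
    """Yield one line-protocol string per CSV row."""
    for row in rows:
        # Tags (meter_id) identify the series; fields (kwh) hold the values.
        ts = datetime.fromisoformat(row["timestamp"]).replace(tzinfo=timezone.utc)
        ns = int(ts.timestamp() * 1e9)  # InfluxDB timestamps default to nanoseconds
        yield f'power,meter_id={row["meter_id"]} kwh={float(row["kwh"])} {ns}'

if __name__ == "__main__":
    reader = csv.DictReader(sys.stdin)
    for line in csv_to_line_protocol(reader):
        print(line)
```

Run as `python3 csv_to_lp.py < meters.csv`; the output can be piped straight to `influx -import`-style tooling or collected by Telegraf.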

The data are generated by 1000+ meter devices and arrive as roughly 100k CSV files over FTP. Ingesting 1 GB currently takes 7 hours, and we might face 60 GB in the future, so with that many files we need parallel ingestion.

What is the benefit of running the Python script through Telegraf? Why not just run the script on its own, with no Telegraf involvement?

You can certainly do it all in the script, too. But I let Telegraf take care of the periodic scheduling and all the talking to InfluxDB, plus Telegraf has extra tagging and routing features.
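A sketch of what that wiring looks like in a Telegraf config, using the `exec` input plugin; the script path, database name, and interval here are illustrative assumptions:

```toml
# Hypothetical Telegraf config: run the converter script periodically
# and forward its line-protocol output to InfluxDB.
[[inputs.exec]]
  commands = ["python3 /opt/scripts/csv_to_lp.py"]  # script must locate the CSVs itself
  data_format = "influx"                            # the script already emits line protocol
  interval = "5m"

[[outputs.influxdb]]
  urls = ["http://localhost:8086"]
  database = "meters"
```

With this split, the script only has to do the CSV-to-line-protocol conversion; Telegraf handles scheduling, batching, retries, and delivery to InfluxDB.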



I want to use an FTP server as the input. My goal is for Telegraf to collect all the CSV data from my FTP server and then transmit it to InfluxDB. How can I do that?

Please help me.

Best Regards,


You will likely have to write a script that monitors the incoming FTP directory for new files and then sends them to InfluxDB/Telegraf. AFAIK there is no direct FTP -> Telegraf input plugin.
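A rough sketch of such a watcher, assuming the FTP server lands files in a local directory and that the CSVs use hypothetical columns (`meter_id`, `kwh`, `epoch`); the paths, database name, and column names are all placeholders. It polls for new files, converts them in parallel across processes, and POSTs the payloads to InfluxDB's v1 `/write` endpoint:

```python
#!/usr/bin/env python3
"""Poll a directory for CSV files dropped by an FTP server, convert them
to line protocol in parallel, and POST the result to InfluxDB.

Paths, database name, and CSV layout are assumptions for illustration.
"""
import csv
import pathlib
import time
import urllib.request
from concurrent.futures import ProcessPoolExecutor

INCOMING = pathlib.Path("/srv/ftp/incoming")  # where the FTP server lands files
INFLUX_WRITE = "http://localhost:8086/write?db=meters&precision=s"

def convert(path):
    """Turn one CSV file into a line-protocol payload (hypothetical columns)."""
    lines = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            lines.append(
                f'power,meter_id={row["meter_id"]} kwh={float(row["kwh"])} {int(row["epoch"])}'
            )
    return "\n".join(lines)

def post(payload):
    """Write one batch of line protocol to InfluxDB over HTTP."""
    req = urllib.request.Request(INFLUX_WRITE, data=payload.encode(), method="POST")
    urllib.request.urlopen(req)

def watch(poll_seconds=10):
    """Poll for unseen CSV files and fan conversion out across processes."""
    seen = set()
    with ProcessPoolExecutor() as pool:
        while True:
            new = [p for p in INCOMING.glob("*.csv") if p not in seen]
            for path, payload in zip(new, pool.map(convert, new)):
                post(payload)
                seen.add(path)
            time.sleep(poll_seconds)
```

Call `watch()` to start polling. For a robust setup you would also want to move or delete processed files instead of tracking them in memory, and handle write failures with retries.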