I am trying to figure out the best way to ingest a large volume of data into InfluxDB.
Our interval meter data is generated at roughly 1 GB per day, currently as many CSV files. We have decided to use InfluxDB, since it is a TSDB, to store the data, and Chronograf to visualize it. My question is: how can we ingest our data into InfluxDB? Does it make sense to ingest flat files (CSV) directly into InfluxDB? If so, how can we do that?
The data come from 1000+ meter devices in the form of ~100k CSV files delivered over FTP. Ingesting 1 GB currently takes 7 hours, and we might face 60 GB per day in the future. Since the number of files is so large, we need parallel ingestion.
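Whatever tool does the ingestion, the CSV rows end up as InfluxDB line protocol. A minimal sketch of that conversion in Python, assuming a hypothetical CSV layout of `meter_id,timestamp_ns,kwh` (your real column names will differ):

```python
import csv
import io

def row_to_line_protocol(measurement, tags, fields, timestamp_ns):
    """Build one InfluxDB line-protocol record from parsed CSV values.

    Escapes commas, spaces, and equals signs in tag keys/values, as the
    line-protocol syntax requires.
    """
    def esc(v):
        return str(v).replace(",", r"\,").replace(" ", r"\ ").replace("=", r"\=")

    tag_str = ",".join(f"{esc(k)}={esc(v)}" for k, v in tags.items())
    field_str = ",".join(f"{k}={v}" for k, v in fields.items())
    return f"{measurement},{tag_str} {field_str} {timestamp_ns}"

# Hypothetical sample file contents; real files would be read from disk.
sample = "meter_id,timestamp_ns,kwh\nM001,1609459200000000000,12.5\n"
lines = []
for row in csv.DictReader(io.StringIO(sample)):
    lines.append(row_to_line_protocol(
        "interval_meter",                      # assumed measurement name
        {"meter_id": row["meter_id"]},         # tag: indexed, low cardinality
        {"kwh": float(row["kwh"])},            # field: the actual reading
        row["timestamp_ns"],
    ))
print(lines[0])
# interval_meter,meter_id=M001 kwh=12.5 1609459200000000000
```

Batches of such lines can then be POSTed to InfluxDB's HTTP write endpoint, and because each file converts independently, you can fan the work out across processes for parallel ingestion.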
You can certainly do it all in a script, too, but I let Telegraf take care of the periodic scheduling and all the talking to InfluxDB; Telegraf also has extra tagging and routing features.
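For files landing in a local directory, one way to let Telegraf handle this is its `directory_monitor` input with the CSV parser. A sketch only, assuming Telegraf 1.18+ (which ships that plugin), InfluxDB 1.x output, and hypothetical paths and column names (`meter_id`, `timestamp_ns`):

```toml
[[inputs.directory_monitor]]
  directory = "/ftp/incoming"           # where the FTP server drops CSVs (assumed path)
  finished_directory = "/ftp/done"      # processed files are moved here
  data_format = "csv"
  csv_header_row_count = 1
  csv_tag_columns = ["meter_id"]        # assumed column name
  csv_timestamp_column = "timestamp_ns" # assumed column name
  csv_timestamp_format = "unix_ns"

[[outputs.influxdb]]
  urls = ["http://localhost:8086"]
  database = "meters"                   # assumed database name
```

Moving finished files out of the incoming directory also gives you a natural record of what has already been ingested.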
I want to use an FTP server as input. My goal is for Telegraf to collect all the CSV data from my FTP server and then forward it to InfluxDB. How can I do that?
You will likely have to write a script that monitors the incoming FTP directory for new files and then sends them to InfluxDB/Telegraf. AFAIK there is no direct FTP -> Telegraf input plugin.
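One way to do that monitoring is a simple polling loop. A minimal sketch in Python, assuming the FTP server writes into a local directory and filenames are never reused; `handle` is a hypothetical callback where you would parse the file and forward it to Telegraf or InfluxDB:

```python
import os
import time

def find_new_files(directory, seen):
    """Return paths of CSV files in `directory` not yet in `seen`,
    adding their names to `seen` so each file is reported once."""
    new = []
    for entry in sorted(os.listdir(directory)):
        if entry.endswith(".csv") and entry not in seen:
            seen.add(entry)
            new.append(os.path.join(directory, entry))
    return new

def watch(directory, handle, interval=5.0):
    """Poll `directory` forever, passing each new CSV path to `handle`."""
    seen = set()
    while True:
        for path in find_new_files(directory, seen):
            handle(path)   # e.g. convert to line protocol and write out
        time.sleep(interval)
```

One caveat with polling an FTP drop directory: a file may still be mid-upload when you spot it, so in practice you would also check that its size has stopped changing (or have the uploader write to a temp name and rename on completion) before handing it off.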