Copy time series data from an ODBC compliant database to InfluxDB

Can I use an ODBC connector and poll data from another database into InfluxDB? The client's database is extremely old, but I can read values from it using ODBC. I want to poll it every 10 minutes and read and store about 1,000 values coming in from a SCADA system.

Not directly, but you could use the exec plugin. With it you would write a script (Ruby, Python, etc.) that grabs this data and returns it to Telegraf. From there it could be stored in InfluxDB.
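As a rough sketch of that approach: the script below queries the source database over ODBC (assuming the `pyodbc` library is installed; the DSN, table, and column names are placeholders) and prints InfluxDB line protocol to stdout, which is what the exec plugin expects when its `data_format` is set to `influx`.

```python
# Hedged sketch of an exec-plugin script. The DSN, table, and column
# names are placeholders; adjust the query for the real schema.

def to_line(measurement, tag_id, value, ts_ns):
    """Format one reading as InfluxDB line protocol (nanosecond timestamp)."""
    return "{m},tag_id={t} value={v} {ts}".format(
        m=measurement, t=tag_id, v=value, ts=ts_ns)

def main():
    import pyodbc  # deferred import so to_line() works without the driver
    conn = pyodbc.connect("DSN=ScadaHistorian")  # placeholder DSN
    cur = conn.cursor()
    cur.execute("SELECT tag_id, value, ts FROM readings")  # placeholder query
    for tag_id, value, ts in cur.fetchall():
        ts_ns = int(ts.timestamp() * 1e9)  # line protocol uses nanoseconds
        print(to_line("scada", tag_id, value, ts_ns))
    conn.close()
```

A real script would call `main()` when executed; Telegraf runs the script on its interval and turns each printed line into one point in InfluxDB.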

Excellent idea. I also see it supports CSV files, so I suppose a clumsy way forward would be to trigger an Excel file to launch, collect the data using ODBC, save it as a CSV file, then launch an input reader to collect and process the CSV file. I could do this once per day.
I couldn't find documentation on installing the plugins on my Windows server (everything seems to assume you are running under Linux), nor any examples of how to configure an input plugin at all (i.e. if you could point me to where it says how to set up the timing, add the plugin, etc., that would be great).
Does the CSV plugin delete the file when it is finished with it, or do I need to run a Windows Task Scheduler job to:

  • launch Excel,

  • save the data as a CSV via an Excel script,

  • schedule the CSV import,

  • delete the CSV file?


So something to remember is that Telegraf runs every 10 seconds (by default).

You don’t really schedule an input plugin inside Telegraf; it just runs at the set interval. The other option, instead of using Telegraf, is to write a script and run it either as a service or via cron. We do both (standalone scripts and Telegraf exec) depending on how often we need to collect the data.
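For reference, a minimal telegraf.conf fragment (the script path is a placeholder) showing the global agent interval and a per-plugin override; the same configuration works on Windows:

```toml
[agent]
  interval = "10s"              # global default collection interval

[[inputs.exec]]
  commands = ["python C:/scripts/scada_poll.py"]  # placeholder path
  timeout = "60s"
  data_format = "influx"        # the script must print line protocol
  interval = "10m"              # per-plugin override of the 10s default
```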

You could create a crontab entry that runs a script which queries ODBC, then makes a POST request against InfluxDB's write endpoint. The request body should follow the InfluxDB line protocol.
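As a hedged sketch of that (standard library only; the host, database name, and measurement are placeholders), the script builds a line-protocol body and POSTs it to the 1.x `/write` endpoint; a crontab entry such as `*/10 * * * * python3 /opt/scripts/scada_push.py` (path hypothetical) would run it every 10 minutes:

```python
# Hedged sketch: build a line protocol body and POST it to InfluxDB's
# 1.x /write endpoint. Host, database, and measurement are placeholders.
import urllib.request

def build_lines(rows, measurement="scada"):
    """rows: iterable of (tag_id, value, ts_ns) -> line protocol body."""
    return "\n".join(
        "{m},tag_id={t} value={v} {ts}".format(m=measurement, t=t, v=v, ts=ts)
        for t, v, ts in rows)

def post_to_influx(body, host="http://localhost:8086", db="scada"):
    # precision=ns tells InfluxDB the timestamps are in nanoseconds
    req = urllib.request.Request(
        url="{}/write?db={}&precision=ns".format(host, db),
        data=body.encode("utf-8"), method="POST")
    return urllib.request.urlopen(req)  # raises on HTTP errors
```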

I wrote a simple Python script that opens the ODBC connection, grabs my data, creates a CSV file, writes to it, and then shuts down. All works perfectly.
The format is Tag_ID, TimeStamp, Value.
So I have multiple rows with timestamps for every device ID; some have only a few entries, some have 144 entries for the day.
Now I just need to figure out how to read the data into InfluxDB. Would you recommend I carry on in Python and run this script once per day, or schedule something using Telegraf to read the created CSV file? Can Telegraf delete the file after a successful read? I wouldn’t want it to read the file every 10 seconds!

If you only need to do this every 24 hours, I would probably skip Telegraf altogether. Run your Python script to build/collect the data (skip the CSV completely) and store it as an array.

Then create a second part of your script that writes it to InfluxDB directly.

Docs for doing so:
Writing Data
InfluxDB Line Protocol
If you wanted to save time, you could use a Python library
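As one possible sketch of that route (naming a library is my assumption; this uses the `influxdb` client for InfluxDB 1.x, installable with `pip install influxdb`; the host, database, and measurement names are placeholders), each (Tag_ID, TimeStamp, Value) row becomes one point:

```python
# Hedged sketch using the influxdb Python client (InfluxDB 1.x).
# Host, database, and measurement names below are placeholders.

def rows_to_points(rows, measurement="scada"):
    """Convert (tag_id, timestamp, value) rows into write_points() dicts."""
    return [{
        "measurement": measurement,
        "tags": {"tag_id": tag_id},
        "time": ts,                        # RFC3339 string or datetime
        "fields": {"value": float(value)}, # keep the field type consistent
    } for tag_id, ts, value in rows]

def write_rows(rows):
    from influxdb import InfluxDBClient  # deferred so rows_to_points()
    # works without the library installed
    client = InfluxDBClient(host="localhost", port=8086, database="scada")
    client.write_points(rows_to_points(rows))
```

Your daily run would then be: query ODBC, collect the rows into a list, and call `write_rows()` once at the end.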