Migrate data from SQL Server to InfluxDB

Hello to you all.

I am quite new to InfluxDB, so maybe my question sounds a bit foolish, but how can I migrate data from a SQL Server database to InfluxDB? I’ve been trying to find some hints around here, but I cannot see anything.

My client has all of its PLC data stored in a SQL Server database, and now we want to move to a more efficient solution, so we need to move all of the SQL Server data to InfluxDB. We are thinking of using Telegraf or Flux tasks to do so, but we cannot find any documentation on how to do it.

Any suggestions? Thanks a lot!!

Hello @rudoce,
You can use the sql.from() function in a task:
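For reference, a minimal sketch of what that could look like (the connection string, table name, and column names below are placeholders, not details from this thread):

```flux
import "sql"

// Read rows from SQL Server; "sqlserver" is one of the drivers sql.from() supports.
sql.from(
    driverName: "sqlserver",
    dataSourceName: "sqlserver://user:password@sqlhost:1433?database=plc_history",
    query: "SELECT tag_name, value, recorded_at FROM sensor_readings"
)
    // Reshape each row into the columns to() expects:
    // _time, _measurement, _field, and _value.
    |> map(fn: (r) => ({
        _time: r.recorded_at,
        _measurement: "plc",
        _field: r.tag_name,
        _value: r.value
    }))
    |> to(bucket: "plc-data")
```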

But tasks are better suited to continuous transformation work than to a bulk migration.
The same goes for many Telegraf input plugins (they’re used to continuously read and write data).

To perform a bulk migration from SQL, you might be better off writing your data to a file or a series of files, depending on how much data you have. Then I’d use the Telegraf file plugin.

This assumes you can write your data into one of the supported input formats, such as JSON, CSV, or line protocol.
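As a rough illustration only (the file paths, bucket, token, and org below are made up), a Telegraf config for that could look something like this:

```toml
# Read previously exported files and write their contents to InfluxDB.
[[inputs.file]]
  ## Files exported from SQL Server, in line protocol in this example.
  files = ["/exports/plc_history_*.lp"]
  ## Other supported values include "csv" and "json".
  data_format = "influx"

[[outputs.influxdb_v2]]
  urls = ["http://localhost:8086"]
  token = "$INFLUX_TOKEN"
  organization = "example-org"
  bucket = "plc-data"
```

Running Telegraf with the --once flag makes it read the files a single time instead of polling continuously.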

Please let me know if that answers your question.

Thanks a lot for your quick response, Anaisdg. I will have a look at that information.

You are right: we will need to perform a bulk migration first, because our client is currently writing new data from the PLCs to that SQL Server database every 15 minutes and deleting old data at the same time, always preserving a whole year of data. He is not willing to remove the SQL Server database from his infrastructure, but he wants a database that can handle the data more efficiently; that is the reason for moving to InfluxDB.
So every 15 minutes new data is ingested into the database and old data is deleted, and that new data must also be ingested (every 15 minutes) into InfluxDB.

So we will also need to do a continuous migration from SQL Server.

Hello @rudoce,
Great! In that case I encourage you to look into the Flux sql package.
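As a very rough sketch (same placeholder connection string, table, and column names as before, and assuming the recorded_at timestamps are stored in UTC), the continuous part could be a task that re-queries only the most recent window on every run:

```flux
import "sql"

// Run every 15 minutes, matching the batch interval on the SQL Server side.
option task = {name: "sqlserver-to-influxdb", every: 15m}

sql.from(
    driverName: "sqlserver",
    dataSourceName: "sqlserver://user:password@sqlhost:1433?database=plc_history",
    // Only pull rows written since the previous run.
    query: "SELECT tag_name, value, recorded_at FROM sensor_readings WHERE recorded_at >= DATEADD(minute, -15, GETUTCDATE())"
)
    |> map(fn: (r) => ({
        _time: r.recorded_at,
        _measurement: "plc",
        _field: r.tag_name,
        _value: r.value
    }))
    |> to(bucket: "plc-data")
```

A small overlap between runs is not a big deal, because points with the same measurement, tag set, and timestamp overwrite each other in InfluxDB rather than being duplicated.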

Thank you very much, Anaisdg.

Following your advice, I will start with a bulk migration using the sql.from() Flux function (I will not use Telegraf, so as not to add more components to the architecture), and later I will use a Flux task to continuously query the new data.

I guess the challenge will be to identify the new data in the SQL Server database using Flux, since I understand Flux is not very effective at lookups or matching against other databases… and I don’t want to lose or duplicate data.

Any suggestions for doing this? I cannot find task examples on the Internet, aside from those InfluxData shows in its documentation…

Thanks a lot!!

How are you writing data to your SQL Server? Could you write directly from the source?
I don’t know if you want to perform a bulk migration with sql.from(); it depends on the amount of data you’re trying to migrate. If I had a lot of data, I would write it into a series of JSON, CSV, or line protocol .txt files and use the Telegraf file plugin to write the data to InfluxDB.

Yes, my client writes from the PLCs to a SQL Server database in batch mode every 15 minutes, and it is a requirement that we treat that database as our source of data.

So OK, I will do as you say.

Thanks a lot for everything.

@rudoce Sure, no problem. Let me know if you need more help.

@rudoce I have a similar requirement… it would be helpful if you could share the approach you ended up taking.