InfluxDB mirroring, syncing, copying - how?

Hello everyone,

Let’s say:
I have an InfluxDB instance collecting data in a remote location with an intermittent internet connection.
I have another instance at headquarters with higher availability.
I want these two databases kept in sync at all times.

I know that the Enterprise edition has this feature, but is there any way to accomplish this with the open source edition?

Best regards,
dersanli

@dersanli

I would use Telegraf's http_listener to ship the metrics over the open internet to the Cloud InfluxDB.
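For reference, a minimal Telegraf config along these lines might look like the following (the port, URL, and database name are placeholders, not anything from this thread):

```toml
# Accept InfluxDB-style writes over HTTP ...
[[inputs.http_listener]]
  service_address = ":8186"

# ... and forward them to the central instance (hypothetical URL).
[[outputs.influxdb]]
  urls = ["https://hq.example.com:8086"]
  database = "telegraf"
```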

To split your writes there are a couple of different methods that could work. One is to use influxdb-relay to write to both InfluxDB and Telegraf. Another is to use InfluxDB's SUBSCRIPTION mechanism to send all data coming into the instance on to your Telegraf instance.
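The subscription route is a single InfluxQL statement on the source instance. A sketch, assuming the database is named `telegraf` and a Telegraf http_listener is waiting on port 8186 of the same host:

```sql
-- Forward every write hitting telegraf/autogen to the listener endpoint.
CREATE SUBSCRIPTION "forward_all" ON "telegraf"."autogen"
  DESTINATIONS ALL 'http://127.0.0.1:8186'
```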

Do either of these options help?

@dersanli

I have a similar use case. I was thinking of polling the remote InfluxDB servers via HTTP, saving the ‘last seen’ timestamp on the central server(s), and pulling down the latest points on every poll.

Some potentially nice aspects of this approach:
1) tolerates disconnects and network outages
2) minimal configuration on the remote InfluxDB system (HTTPS and basic auth)
3) no need to touch influxdb-relay, Telegraf, or SUBSCRIPTION
4) can distribute to multiple servers
5) no need for VPN, tunneling, etc.
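To make the idea concrete, here is how one poll's query could be built: everything newer than the stored nanosecond timestamp, sent to InfluxDB's `/query` endpoint. The method names are my own and the host/database are placeholders:

```ruby
require "erb"

# Incremental query: only points newer than the "last seen" timestamp
# (a nanosecond epoch integer saved on the central server after each poll).
def build_poll_query(measurement, last_seen_ns)
  %(SELECT * FROM "#{measurement}" WHERE time > #{last_seen_ns})
end

# Target the remote instance's /query endpoint; epoch=ns asks InfluxDB to
# return timestamps as integers we can store and compare directly.
def poll_url(host, db, query)
  "https://#{host}/query?db=#{db}&epoch=ns&q=#{ERB::Util.url_encode(query)}"
end
```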

WDYT? Are there any downsides?

I’d love to see this implemented, as I’m looking at similar solutions and really don’t want to have to implement influxdb-relay.

It’s on my todo list. I will write this as a Ruby script that can be invoked from the command line or from cron. I anticipate on the order of 150 lines of code and will post the script here when it is working.
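In the meantime, a rough sketch of what such a script might look like. Every name here (hosts, database, measurement, state-file path) is a placeholder, and it ignores tags and string/integer field typing for brevity:

```ruby
#!/usr/bin/env ruby
# Cron-able puller: fetch points newer than the saved "last seen"
# timestamp from the remote InfluxDB and replay them into the central one.
require "net/http"
require "json"
require "uri"

REMOTE      = URI("https://remote.example.com:8086")  # remote-site InfluxDB
CENTRAL     = URI("http://localhost:8086")            # HQ InfluxDB
DB          = "telegraf"
MEASUREMENT = "cpu"
STATE_FILE  = "/var/lib/influx-pull/last_seen_ns"     # hypothetical path

def last_seen
  File.exist?(STATE_FILE) ? File.read(STATE_FILE).strip.to_i : 0
end

def save_last_seen(ns)
  File.write(STATE_FILE, ns.to_s)
end

# Ask the remote instance for everything newer than the saved timestamp,
# with timestamps returned as nanosecond epoch integers.
def fetch_new_points(since_ns)
  uri = REMOTE.dup
  uri.path  = "/query"
  uri.query = URI.encode_www_form(
    db: DB, epoch: "ns",
    q: %(SELECT * FROM "#{MEASUREMENT}" WHERE time > #{since_ns})
  )
  JSON.parse(Net::HTTP.get_response(uri).body)  # add basic auth for real use
end

# Re-serialize one JSON result series as InfluxDB line protocol
# (simplified: every non-time column is written as a plain field).
def to_line_protocol(series)
  cols = series["columns"]
  series["values"].map do |row|
    ts     = row[cols.index("time")]
    fields = cols.zip(row).reject { |c, _| c == "time" }
                 .map { |c, v| "#{c}=#{v}" }.join(",")
    "#{series["name"]} #{fields} #{ts}"
  end
end

def run_once
  since  = last_seen
  result = fetch_new_points(since)
  (result.dig("results", 0, "series") || []).each do |series|
    uri = CENTRAL.dup
    uri.path  = "/write"
    uri.query = URI.encode_www_form(db: DB, precision: "ns")
    Net::HTTP.post(uri, to_line_protocol(series).join("\n"))
    since = [since, *series["values"].map(&:first)].max
  end
  save_last_seen(since)
end

# Guarded so loading the file has no side effects; a cron wrapper would
# set INFLUX_PULL_RUN=1 and invoke this script.
run_once if ENV["INFLUX_PULL_RUN"]
```

Because the state file only advances after a successful write, a failed poll is simply retried in full on the next cron run, which is what gives the tolerance to disconnects described above.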