Poor man's remote backup/replication strategy advice needed

Hi all!
I am running a weather data collection app that continuously stores measurements from multiple BLE sensors in an InfluxDB instance on a Raspberry Pi in my hut in the country. I would like to replicate the data to an identical InfluxDB instance running in the Amazon cloud, to 1) make it more accessible from the outside world and 2) have a remote backup copy just in case. The remote copy does not have to be up-to-the-minute accurate; a few hours of delay is tolerable. The link from the Pi to the internet is fast and mostly reliable, but blackouts do happen every now and then.
What I have in mind is running a script from cron on the Pi every N hours that would:

  1. Back up the local database with `-since last_successful_replication_time`.
  2. Restore the remote database from the backup just made.

Now the question is: how do I reliably determine `last_successful_replication_time`? The most reliable way would be to query the remote database for the very last measurement it holds, but how can I do that from the command line and get back a timestamp I can then feed to the backup command?
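Concretely, the sketch I have in mind looks something like this (untested; `weather`, `conditions`, `temperature`, the host and the paths are just placeholders, and I'm assuming InfluxDB 1.x with the portable backup format):

```shell
#!/usr/bin/env bash
# Sketch of the cron job. "weather", "conditions", "temperature" and the
# host are placeholders, not real names from my setup.
set -euo pipefail

REMOTE_HOST=influx.example.com      # assumed remote InfluxDB endpoint
DB=weather                          # assumed database name
BACKUP_DIR=/tmp/influx-backup

# Pull the RFC3339 timestamp out of the CSV that the influx CLI prints
# for a "SELECT last(...)" query (a header row "name,time,last",
# then one data row).
last_ts() {
  echo "$1" | awk -F, 'NR==2 {print $2}'
}

main() {
  # Ask the remote server for its newest point; -precision rfc3339 makes
  # the time column directly usable as a -since argument.
  local csv since
  csv=$(influx -host "$REMOTE_HOST" -database "$DB" -precision rfc3339 \
        -format csv -execute "SELECT last(temperature) FROM conditions")
  since=$(last_ts "$csv")

  # Back up only the points written after that timestamp.
  rm -rf "$BACKUP_DIR"
  influxd backup -portable -database "$DB" -since "$since" "$BACKUP_DIR"

  # Shipping the backup directory to the cloud host and running
  # "influxd restore" there would follow here.
}

# Only run for real when the influx CLI is actually installed.
if command -v influx >/dev/null 2>&1; then
  main
fi
```

The awkward part is still step one: parsing the CSV feels fragile, so if there is a cleaner way to get just the timestamp, I'm all ears.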

Also, if anybody has any comments on why I shouldn't be doing it this way at all but should be doing it a totally different way, I will appreciate those! :wink: I am totally new to InfluxDB and may not be taking a lot of things into account.
PS One more important question: am I right that when I restore a partial backup (made with `-since`) to an existing database, it won't wipe the pre-existing data but will just add the backup's contents to whatever is already there? If not, then I guess my whole strategy goes to pot…
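In case a partial restore refuses to touch an existing database, I imagine I could restore into a scratch database first and then copy the points over, roughly like this (untested sketch, InfluxDB 1.x assumed; `weather` and the path are placeholders):

```shell
# Restore into a scratch database, merge, then drop it.
# "weather" and /tmp/influx-backup are placeholder names.
SRC_DB=weather
TMP_DB=${SRC_DB}_incoming

# Build the query that copies every measurement from the scratch
# database into the real one, preserving tags (GROUP BY *).
merge_query() {
  printf 'SELECT * INTO %s..:MEASUREMENT FROM /.*/ GROUP BY *' "$1"
}

# Only run for real when the InfluxDB tools are actually installed.
if command -v influxd >/dev/null 2>&1; then
  # 1. Restore the partial backup into the scratch database...
  influxd restore -portable -db "$SRC_DB" -newdb "$TMP_DB" /tmp/influx-backup
  # 2. ...copy its points into the live database...
  influx -database "$TMP_DB" -execute "$(merge_query "$SRC_DB")"
  # 3. ...and drop the scratch database.
  influx -execute "DROP DATABASE \"$TMP_DB\""
fi
```

Is that extra round trip really necessary, or does a plain restore merge on its own?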