Copy data from one database to another as it is written (Kapacitor?)


I am using a staging db to write data to InfluxDB. Data written to the staging db may carry timestamps far in the past. I need to provide access to the data (at database level) based on some criteria (e.g. data tagged with server='xyz' can only be accessed by user 'user_xyz'). What is the best way to achieve this?

One idea is to copy data from the staging db into separate final dbs and assign specific access rights to each final db. How can I copy the data, e.g. with Kapacitor (or continuous queries), while also ensuring that past data (points pushed now but timestamped up to a year back) is copied?
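For the copy itself, a plain InfluxQL `SELECT ... INTO` query reads data already stored, so it also picks up backfilled points with old timestamps. A minimal sketch of the idea above (the target db `db_final` and the password are placeholders, not from the original post):

```sql
-- Copy all points (including old timestamps) matching the tag criterion
-- from the staging db into a final db, preserving tags via GROUP BY *
SELECT * INTO "db_final"."autogen".:MEASUREMENT
FROM "db_staging"."autogen"./.*/
WHERE "server" = 'xyz'
GROUP BY *

-- Then restrict access at the database level
CREATE USER "user_xyz" WITH PASSWORD 'changeme'
GRANT READ ON "db_final" TO "user_xyz"
```

Unlike a scheduled Kapacitor batch task, this query scans the whole retention period, so it works for a one-off or periodically re-run backfill.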

I found a way to do it with Kapacitor, but it does not work when I write old data:

    |query('SELECT mean(value) FROM "db_staging"."autogen"."temp"')
        .groupBy(time(100ms), *)


kapacitor define move_data -type batch -tick b_test_back_data.tick -dbrp db_staging.autogen
kapacitor enable move_data
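For reference, a complete batch task along these lines needs a schedule and an output node; something like the sketch below (the `db_final` target is a placeholder). The structure also shows why it misses backfilled points: `.period()` and `.every()` make the task query a recent time window on a wall-clock schedule, so points written now with year-old timestamps fall outside that window and are never selected.

```
// TICKscript sketch: periodically copy recent aggregates to another db
batch
    |query('SELECT mean(value) AS value FROM "db_staging"."autogen"."temp"')
        .period(10s)          // width of the queried time window
        .every(10s)           // how often the query runs
        .groupBy(time(100ms), *)
    |influxDBOut()            // write the results into the final database
        .database('db_final')
        .retentionPolicy('autogen')
        .measurement('temp')
```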

Is there a way of triggering Kapacitor on writes of old data as well? Or…



You don’t need to copy data to a new database to control access. InfluxDB Enterprise supports fine-grained access to the data at the measurement and series level. Here is a link to our documentation which discusses this capability: Fine-grained authorization | InfluxData Documentation Archive

Trying to limit access by creating a large number of databases will cause performance issues as the number of databases grows, and is not recommended.
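As a rough illustration of what a fine-grained grant looks like: in InfluxDB Enterprise, grants are managed through the meta node API at `/influxdb/v2/acl/grants`. The payload shape, port, and admin credentials below are assumptions to verify against the linked documentation for your version:

```shell
# Sketch only: grant user_xyz read access to series tagged server=xyz.
# Check the endpoint, port, and JSON shape against the fine-grained
# authorization docs for your Enterprise version before using.
curl -s -L -XPOST "http://localhost:8091/influxdb/v2/acl/grants" \
  --user "admin:changeit" \
  -H "Content-Type: application/json" \
  --data-binary '{
    "database": {"match": "exact", "value": "db_staging"},
    "tags": [{"match": "exact", "key": "server", "value": "xyz"}],
    "permissions": ["read"],
    "users": [{"name": "user_xyz"}]
  }'
```

With a grant like this, both users can stay on the single staging database and each only sees the series matching their tag, which avoids the database-proliferation problem described above.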
