Continuous queries and downsampling


Hi All,

We want to downsample all of our raw measurements (resolution of roughly 3 s) older than 6 months into the same measurement at a 15-minute resolution. After 5 years the data can be deleted entirely.
I’ve seen many examples, but all of them put the downsampled data into a new database. That is not what we want. We want the resulting data to be in the same database so we can query it together with the raw data.
For example:
Make a graph that stretches from 7 to 5 months ago (the oldest data will have a resolution of 15 min and the youngest 3 s).

Could someone show me how to do this? (I mean: How do I create the retention policies and a continuous query that does the downsampling)


Where do you see that?
The downsampled data is normally written to a new measurement (the equivalent of a table in SQL) in the same database, and that is the reasonable way to do it.

What you should do is:

  1. Create a retention policy for 6 months, name it six_month, and make it the default.
  2. Given that your main measurement is named mydata, where you keep the 3s-resolution data, it will land in the six_month RP.
  3. The autogen RP, which keeps data forever, will host your mydata_15m measurement.
  4. Create a CQ that summarizes the data and writes it into autogen.mydata_15m.
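A minimal InfluxQL sketch of the steps above (the database name mydb is an assumption; adjust names, durations, and the aggregate function to your schema):

```sql
-- Step 1: six-month RP for the raw 3 s data, made the default so writes land here.
-- DURATION 26w is an approximation of 6 months.
CREATE RETENTION POLICY "six_month" ON "mydb" DURATION 26w REPLICATION 1 DEFAULT

-- Step 3: autogen already exists with infinite retention; mydata_15m will live there.

-- Step 4: CQ that downsamples all fields of mydata into autogen.mydata_15m.
-- GROUP BY time(15m), * keeps every tag; mean(*) averages every field
-- (the results get a mean_ prefix, e.g. value -> mean_value).
CREATE CONTINUOUS QUERY "cq_mydata_15m" ON "mydb"
BEGIN
  SELECT mean(*) INTO "mydb"."autogen"."mydata_15m"
  FROM "mydb"."six_month"."mydata"
  GROUP BY time(15m), *
END
```

Queries can then target either retention policy explicitly, e.g. `SELECT mean_value FROM "mydb"."autogen"."mydata_15m"` for the old data and `SELECT value FROM "mydb"."six_month"."mydata"` for the recent data.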

Hello Quan,

Thanks so much for your reply.
"all of them put the downsampled data into a new database"
I was using the wrong terminology there. What I mean is that the downsampled data is in another “namespace” or “table” so to speak…
What I want is for the downsampled data to be in the same “namespace” or “table”.
I want to use the downsampled and raw data in one and the same query.
After 6 months the data should “degrade” to 15-minute resolution, that’s all I want.
I do not want this 15-minute data to live in a different place from the “raw” data.
Is it possible to do this?



That might cause you headaches.

When you write downsampled data into the same measurement as the original, do you delete the original points?
If you do, there will be a lot of delete operations, which, I think, is not healthy for a time-series database.

If you do not delete, how do you differentiate the original data from the downsampled data when they fall on the same time point?

For example, you have a data point at 00:00:00. Because you “degrade” to 15-minute resolution, you take all data from 00:00:00 to 00:15:00, compute the average, and write it back to 00:00:00. What happens when new data is written to the same “slot” as the old data?
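The collision described above follows from InfluxDB's write semantics: two points with the same measurement, tag set, and timestamp are merged, and the later write wins for any common field. A sketch in line protocol (the `sensor` tag and `value` field are illustrative names, not from the original schema):

```
mydata,sensor=s1 value=3.2 1500000000000000000
# ... later, a CQ writes the 15-minute average back to the same timestamp:
mydata,sensor=s1 value=2.7 1500000000000000000
# The second point silently replaces value=3.2 -- the raw point is gone,
# and nothing in the data marks the remaining point as an average.
```

This is why the downsampled data is usually kept in a separate measurement or retention policy: the timestamp alone cannot tell raw and aggregated points apart.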
