Continuous queries and downsampling

That might give you a headache.

When you put the downsampled data into the same table as the original data, do you delete the original?
If you do, there will be a lot of “delete” operations, which, I think, is not healthy for a time-series database.

If you do not delete, how do you differentiate the original data from the downsampled data when they sit at the same time point?
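
To make that ambiguity concrete, here is a minimal Python sketch, assuming a store that simply keeps every point it receives under one measurement; the timestamps and values are made up. Once the 15-minute average sits next to the raw points, a query over the same window has nothing to tell them apart and ends up double counting:

```python
from datetime import datetime, timedelta

t0 = datetime(2023, 1, 1, 0, 0, 0)

# Raw points every 5 minutes, plus the 15-minute average written back
# at 00:00:00 into the same table (this store keeps both copies).
points = [
    (t0,                         10.0),  # raw
    (t0 + timedelta(minutes=5),  20.0),  # raw
    (t0 + timedelta(minutes=10), 30.0),  # raw
    (t0,                         20.0),  # downsampled mean of the three raw points
]

# A later query over 00:00-00:15 cannot tell raw values from the
# downsampled one, so aggregates are skewed by double counting.
window = [v for ts, v in points if t0 <= ts < t0 + timedelta(minutes=15)]
print(len(window), sum(window))  # 4 points, sum 80.0 -- raw data alone gives 3 points, 60.0
```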

For example, you have a data point at 00:00:00. Because you “degrade” to 15-minute resolution, you take all data from 00:00:00 to 00:15:00, compute the average, and write it back at 00:00:00. What happens if new data is written to the same “slot” as the old data?
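
Here is a minimal sketch of that slot collision, assuming a toy store where a point's identity is just (measurement, timestamp) and the last write wins; the measurement name "cpu" and all values are hypothetical:

```python
from datetime import datetime, timedelta

# Toy store: one value per (measurement, timestamp), last write wins.
store = {}

def write(measurement, ts, value):
    store[(measurement, ts)] = value

t0 = datetime(2023, 1, 1, 0, 0, 0)

# Raw samples for the 00:00-00:15 window.
for minute, v in [(0, 10.0), (5, 20.0), (10, 30.0)]:
    write("cpu", t0 + timedelta(minutes=minute), v)

# Downsample: average the window and write it back to 00:00:00.
window = [v for (m, ts), v in store.items()
          if m == "cpu" and t0 <= ts < t0 + timedelta(minutes=15)]
write("cpu", t0, sum(window) / len(window))  # silently replaces the raw 10.0

# A late-arriving raw sample lands in the same slot and replaces the average.
write("cpu", t0, 12.5)

print(store[("cpu", t0)])  # 12.5 -- the 15-minute average is gone
```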