Looking for guidance on downsampling based on previous data

Hello,
I am in the early learning stages of InfluxDB 2 for storing data from my lab. At this point, I know how to store high-resolution data in a bucket (24-hour retention period), and I can store a downsampled version (average over a fixed time window) in another bucket. However, for my application, I would like to downsample based on a given deviation from a previously stored value.
An example of what I mean is the following. Assume a series of high-resolution data (ignoring timestamps) like 100; 101; 102; 104; 103; 104; 103; 105; 106 and a deviation of +/-1: the downsampled data would be 100; 102; 104; 106. While there is a jump larger than 1 between 103 and 105, the last stored downsampled value is 104, so the change is not larger than 1.
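In code, the rule I have in mind looks roughly like this minimal Python sketch (the function name is just for illustration):

```python
# Minimal sketch of the deviation-based filter described above:
# a value is kept only if it differs from the last *kept* value
# by more than the allowed deviation; the first value is always kept.
def downsample_by_deviation(values, deviation):
    if not values:
        return []
    kept = [values[0]]
    for value in values[1:]:
        if abs(value - kept[-1]) > deviation:
            kept.append(value)
    return kept

print(downsample_by_deviation([100, 101, 102, 104, 103, 104, 103, 105, 106], 1))
# -> [100, 102, 104, 106]
```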

In addition, I would like to be able to associate a different deviation with each field in my bucket. Is there a way to have a “lookup” table (another bucket?) that stores this information and to automagically add fields to the task?

I searched around but only found “Standard functions” approaches being discussed. Can you point me in the right direction to achieve this kind of downsampling?

Hello @mcouder,
I’m not sure how you could apply this custom function with Flux. Flux doesn’t support for loops, so I don’t think it’s possible.
I’m sorry.
However, you can always use the client libraries and perform this type of downsampling in another language, then use something like AWS Lambda to execute the logic.
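For example, with the Python client library (influxdb-client) the logic could look roughly like the sketch below. The bucket names, measurement, field names, and deviation values are placeholders, and a plain dict stands in for the per-field “lookup table” you mentioned:

```python
# Rough sketch using the InfluxDB 2.x Python client (influxdb-client).
# Bucket names, measurement, fields, and deviations are placeholders.
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

DEVIATIONS = {"current": 0.5, "field_strength": 1.0}  # per-field thresholds

client = InfluxDBClient(url="http://localhost:8086", token="my-token", org="my-org")
query_api = client.query_api()
write_api = client.write_api(write_options=SYNCHRONOUS)

flux = '''
from(bucket: "high_res")
  |> range(start: -24h)
  |> filter(fn: (r) => r._measurement == "magnets")
'''

last_kept = {}  # field name -> last value written to the downsampled bucket
for table in query_api.query(flux):
    for record in table.records:
        field = record.get_field()
        value = record.get_value()
        deviation = DEVIATIONS.get(field)
        if deviation is None:
            continue  # no threshold configured for this field
        # Keep the point only if it moved more than `deviation` away
        # from the last value kept for this field.
        if field not in last_kept or abs(value - last_kept[field]) > deviation:
            last_kept[field] = value
            point = (Point(record.get_measurement())
                     .field(field, value)
                     .time(record.get_time()))
            write_api.write(bucket="downsampled", record=point)

client.close()
```

Tags are dropped in this sketch for brevity; you could copy them from each record onto the Point if you need them, and the whole script could run on a schedule (cron, a task runner, or a serverless function).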

Hi @Anaisdg,
Thank you for the answer. Can you point me towards the client libraries that could help with this kind of task?

Also, forgive my naive, newbie understanding, but I don’t understand why the method I am hoping to use is not more widely used in time-series applications. I am hoping that I am missing a simple alternative. If you have the time, I would appreciate it if you could look at my use case below and maybe point me in a direction that I did not envision for the downsampling.

In my application, the measured values are stable for long periods, and the absence of change is not critical information. However, when things change, I would like to maintain a chosen level of granularity. For example, I use magnets: the current commands and readbacks to and from the power supplies are regularly recorded together with the measured magnetic fields. During an experiment, after a field (current) change, a key piece of information is how the field stabilizes to its “final” value and whether any power supply glitch took place during the change. After that, things can be stable for hours. I wish I could preserve the information about the change while ignoring the much larger amount of identical data.