How to aggregate non-regular values using a time-weighted average

I hope you can help me with a behaviour that I do not fully understand.

I have various sensors sending their values at non-regular intervals; mostly they only send a value when it changes.

For some of those values I need to show aggregated values, and a time-weighted average would be much better suited than a plain average.

Basically: I want to split the data into slots of one minute; in one minute there can be zero, one, or more values; for each one-minute slot I want to calculate the weighted average based on how long each value was set. For slots with no values, I expect the last seen value to be carried forward and used as the slot's only value.
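To make that concrete with made-up numbers: if within one slot the value is 10 for the first 40 seconds and then 20 for the last 20 seconds, I expect (10 × 40 + 20 × 20) / 60 ≈ 13.33 for that slot. As input, such a slot would look like:

import "array"

// Hypothetical one-minute slot (timestamps and values are made up):
// the value 10 is set at 00:00:00 and replaced by 20 at 00:00:40.
array.from(rows: [
  {_time: 2022-01-01T00:00:00Z, _value: 10.0},
  {_time: 2022-01-01T00:00:40Z, _value: 20.0},
])

// Expected slot result: (10*40 + 20*20) / 60 ≈ 13.33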

However, doing some tests with the timeWeightedAvg() function I get some very weird results.

You can find an example of data in the attachment.

With a query like:

from(bucket: "test_items")
  |> range(start: 2022-01-01T00:15:00Z, stop: 2022-01-01T00:25:00Z)
  |> filter(fn: (r) => r["_measurement"] == "items")
  |> filter(fn: (r) => r["_field"] == "temp")
  |> window(every: 1m, createEmpty: true)
  |> timeWeightedAvg(unit: 1m)

I get this output:

And I really do not understand how those values are calculated: given the data that I have, they make no sense to me.

Where can I find the details about how timeWeightedAvg() works?

Do you know how to fix the query for my needs?

Hello @dadivus,
Time is weighted using the linearly interpolated integral of values in the table.
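In other words, between two consecutive points the contribution should be the trapezoid ((v1 + v2) / 2) × (t2 − t1), and timeWeightedAvg() should return the total area divided by the range length expressed in unit. A minimal sketch of that expectation (made-up values; I have not verified how the boundaries at _start/_stop are handled):

import "array"

// Two points 30 s apart inside a one-minute range.
// Trapezoid area between them: (10 + 20) / 2 * 30 s = 450 value-seconds.
// If timeWeightedAvg() really is the linearly interpolated integral
// divided by the unit, the result should land between 10 and 20,
// not far outside that range.
array.from(rows: [
  {_time: 2022-01-01T00:00:00Z, _value: 10.0},
  {_time: 2022-01-01T00:00:30Z, _value: 20.0},
])
  |> range(start: 2022-01-01T00:00:00Z, stop: 2022-01-01T00:01:00Z)
  |> timeWeightedAvg(unit: 1m)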

But I agree I don’t understand what’s going on here.

import "array"
array.from(rows: [{_time: 2022-01-02, _value: 1}, {_time: 2022-01-03, _value: 2}, {_time: 2022-01-04, _value: 3}])
  |> range(start: 2022-01-01)
  |> timeWeightedAvg(unit: 1d)

gives 171.40402444422162
When I picture a straight line through (1, 1), (2, 2), (3, 3) and take the integral, the trapezoidal area is (1 + 2) / 2 + (2 + 3) / 2 = 4 value-days, so I don't get anywhere near that value.
I'm asking around.
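In the meantime, if what you mainly need is the "carry the last value forward" behaviour, a workaround might be aggregateWindow() with last plus fill() (a sketch against your bucket, untested with your data; note it keeps one value per slot rather than weighting inside the slot):

from(bucket: "test_items")
  |> range(start: 2022-01-01T00:15:00Z, stop: 2022-01-01T00:25:00Z)
  |> filter(fn: (r) => r["_measurement"] == "items")
  |> filter(fn: (r) => r["_field"] == "temp")
  // last value seen in each 1-minute slot; empty slots are kept...
  |> aggregateWindow(every: 1m, fn: last, createEmpty: true)
  // ...and filled with the previous slot's value
  |> fill(usePrevious: true)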
I created this issue:


Pardon, I forgot to include the test dataset: