I hope you can help me with a behaviour that I do not fully understand.
I have various sensors sending their values at irregular intervals; mostly they only send a value when it changes.
For some of those values I need to show aggregated data, and a time-weighted average would be much better suited than a plain average.
Basically: I want to split the data into slots of one minute; each slot can contain zero, one, or more values; for each one-minute slot I want to calculate the average weighted by how long each value was set. For slots with no values, I expect the last seen value to be carried forward as the slot's only value.
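To make the behaviour I expect concrete, here is a minimal Python sketch of the calculation I have in mind (the function name and sample data are just illustrative, not part of my actual setup): each value contributes to the slot's average in proportion to the time it was in effect, and the last value seen before the slot opens is carried into it.

```python
from datetime import datetime, timedelta

def time_weighted_avg(points, start, stop):
    """Time-weighted average of `points` over the window [start, stop).

    points: list of (timestamp, value) pairs sorted by timestamp; it should
    include the last value seen *before* `start` so that value can be
    carried forward into the window.
    """
    prev_val = None   # last value seen at or before the window start
    in_window = []    # points that fall inside the window
    for t, v in points:
        if t <= start:
            prev_val = v
        elif t < stop:
            in_window.append((t, v))

    total = 0.0
    cursor, current = start, prev_val
    # Each value is weighted by the seconds until the next change.
    for t, v in in_window:
        if current is not None:
            total += current * (t - cursor).total_seconds()
        cursor, current = t, v
    if current is not None:
        total += current * (stop - cursor).total_seconds()

    return total / (stop - start).total_seconds()

# Example: value 10 set before the window, changed to 20 halfway through
# a one-minute slot -> expected average (10*30s + 20*30s) / 60s = 15.0
start = datetime(2022, 1, 1, 0, 15)
stop = start + timedelta(minutes=1)
pts = [(datetime(2022, 1, 1, 0, 14), 10.0),
       (datetime(2022, 1, 1, 0, 15, 30), 20.0)]
print(time_weighted_avg(pts, start, stop))  # -> 15.0
```

An empty slot then naturally yields the carried-forward value itself, since that value covers the whole window.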
However, doing some tests with the timeWeightedAvg() function, I get some very weird results.
You can find an example of data in the attachment.
With a query like:

from(bucket: "test_items")
  |> range(start: 2022-01-01T00:15:00Z, stop: 2022-01-01T00:25:00Z)
  |> filter(fn: (r) => r["_measurement"] == "items")
  |> filter(fn: (r) => r["_field"] == "temp")
  |> window(every: 1m, createEmpty: true)
  |> timeWeightedAvg(unit: 1m)
I get this output:
I really do not understand how those values are calculated: given the data I have, they make no sense to me.
Where can I find the details about how timeWeightedAvg() works?
Do you know how to fix the query for my needs?