Hi all!
I am downsampling a set of data that spans midnight and writing the results to a bucket using `to()`. My goal is to aggregate the data and set the `_time` field to 00:00:00 of the day being processed. The problem I'm seeing is that the time stored in the bucket differs from the time being passed to the `to()` function.
In my current testing, I filter a collection of records between `2021-12-06T17:00:00Z` and `2021-12-07T08:00:00Z` (America/New_York timezone), then map the `_time` of all records to `2021-12-06T00:00:00Z` and aggregate them by 1d. This leaves me with the following result:
| table | _measurement | _time | _value | _field |
|---|---|---|---|---|
| 0 | pressure | 2021-12-06T00:00:00Z | 4.202409639 | mean |
| 1 | pressure | 2021-12-06T00:00:00Z | 4.057142857 | mean |
After writing these records to a different bucket using `to(bucket: "data_agg")`, the time is shifted from 00:00:00 to something else, in this case 00:30:00. Here is the resulting data from the data_agg bucket:
| _time | _value | _field | _measurement |
|---|---|---|---|
| 2021-12-06T00:30:00Z | 4.202409639 | mean | pressure |
| 2021-12-06T00:30:00Z | 4.057142857 | mean | pressure |
I'm not sure why this is happening, and I get the same result no matter which of the many approaches I try: mapping `_time`, setting a timezone, adjusting `_stop` and `_start`, shifting time, and so on. I have reached a frustrating dead end and am really hoping someone can help me understand what is going on.
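For reference, one of the time-shifting variants I tried looked roughly like this. The idea is that `aggregateWindow()` stamps each output row with the window's `_stop` time, so a `timeShift()` of -24h should move the 1d window's timestamp back to midnight of the day being aggregated (the -24h offset is specific to this single-day example):

```flux
from(bucket: "loradata")
    |> range(start: 2021-12-06T17:00:00Z, stop: 2021-12-07T08:00:00Z)
    |> filter(fn: (r) => r._measurement == "pressure")
    |> aggregateWindow(fn: mean, every: 1d, createEmpty: false)
    // aggregateWindow assigns _time from the window's _stop boundary;
    // shift back one day so rows land on 2021-12-06T00:00:00Z
    |> timeShift(duration: -24h, columns: ["_time"])
    |> to(bucket: "data_agg")
```

This variant still ends up with the same shifted timestamps in the data_agg bucket.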
Code sample of the process (aggregating with `mean`, per the `_field` values above):
from(bucket: "loradata")
    |> range(start: 2021-12-06T17:00:00Z, stop: 2021-12-07T08:00:00Z)
    |> filter(fn: (r) => r._measurement == "pressure")
    // Force every record's _time to midnight of the day being processed
    |> map(fn: (r) => ({r with _time: time(v: "2021-12-06T00:00:00Z")}))
    |> aggregateWindow(fn: mean, every: 1d, createEmpty: false)
    // aggregateWindow re-assigns _time from the window's _stop, so map it back
    |> map(fn: (r) => ({r with _time: time(v: "2021-12-06T00:00:00Z")}))
    |> to(bucket: "data_agg")
Thanks!