Downsampling based on daily last data

I have a couple of measurements in a bucket. Data are collected every day (except on Thursdays and Fridays) from 9 AM to 1 PM, at one-minute intervals.
Is it possible to store the last daily value of each measurement in a new one? The timestamp of the last reading is not always the same, e.g. one day it is 12:59:30 and another day it is 12:58:49.

Without looking in detail I would do the following:

  • construct a field my_date containing only the date using map()
  • group your data by my_date
  • sort by _time in descending order (so the last reading comes first)
  • use |> top(n: 1) to keep only the last record for each date
  • use |> to() to write it to a bucket

Do you mean something like this?

import "date"

from(bucket: "stocks")
    |> range(start: -72h)
    |> filter(fn: (r) => r["_measurement"] == "some_name")
    |> filter(fn: (r) => r["_field"] == "last")
    |> map(
        fn: (r) => ({r with
            _date: date.truncate(t: r._time, unit: 1d),
            _value: r._value,
            _name: r._measurement,
        }),
    )
    |> group(columns: ["_date"])
    |> last()

This works great on one measurement. Can we perform this on all measurements in one bucket?

Changing the group solved the problem:

import "date"
from(bucket: "stocks")
    |> range(start: -72h)
    |> filter(fn: (r) => r["_field"] == "last")
    |> map(
        fn: (r) => ({r with
            _date: date.truncate(t: r._time, unit: 1d),
            _value: r._value,
            _name: r._measurement,
        }),
    )
    |> group(columns: ["_measurement", "_date"])
    |> last()
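
The steps at the top also mentioned writing the result out with to(). A sketch of the complete pipeline under that assumption; the destination bucket name "stocks_daily" is made up here, and the helper _date column is dropped before writing, since to() expects group-key columns other than the standard ones to be string tags:

```flux
import "date"

from(bucket: "stocks")
    |> range(start: -72h)
    |> filter(fn: (r) => r["_field"] == "last")
    // Truncate each timestamp to midnight so records share a daily grouping key
    |> map(fn: (r) => ({r with _date: date.truncate(t: r._time, unit: 1d)}))
    |> group(columns: ["_measurement", "_date"])
    // Keep only the latest record in each measurement/day group
    |> last()
    // Remove the helper column before writing (it is a time value, not a tag)
    |> drop(columns: ["_date"])
    // Write the daily closing values to the destination bucket
    |> to(bucket: "stocks_daily")
```

Running this on a schedule (e.g. as an InfluxDB task shortly after the market closes) would keep the downsampled bucket up to date.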

Nice!
Couldn’t have done it better myself!