Bump: does anybody have a suggestion on how to solve this?
Then, just increase the range of the query:
```flux
import "experimental"

data = from(bucket: "telegraf")
    |> range(start: experimental.subDuration(d: 1d, from: now()))
    |> filter(fn: (r) => r["_measurement"] == "solar")
    |> filter(fn: (r) => r["topic"] == "import")
```
Using this, you will collect the last full day of data. Then, when you aggregate by hour and apply difference(), the delta is still calculated even if more than an hour has elapsed between points.
If you want to optimize it, you can also apply tail(n: 2) so that only the last two entries are taken into account.
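Putting those two suggestions together, the query might look something like this (a sketch only: the bucket and tag names follow the example above, and the hourly window is an assumption):

```flux
import "experimental"

data = from(bucket: "telegraf")
    |> range(start: experimental.subDuration(d: 1d, from: now()))
    |> filter(fn: (r) => r["_measurement"] == "solar")
    |> filter(fn: (r) => r["topic"] == "import")
    |> aggregateWindow(every: 1h, fn: last)  // one point per hour
    |> difference()                          // per-hour deltas, even across gaps
```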
Let me know how this goes.
Hi, thanks for your thoughts.
Yes, this is indeed what I have done now. I have created a task that runs daily to aggregate the last 24 hours. This repairs the “holes”.
That works… until there is a day or a week, e.g. during vacations, with no usage for more than 24 hours.
Indeed, querying by time range can be inconvenient for certain applications. Nevertheless, you can always query a larger range of data and keep the last two entries using tail(). That way you make sure you retrieve data even after a long holiday, while the operations remain cheap since you are working with only two records.
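As a sketch of that idea (the 90d range is an arbitrary assumption; pick whatever comfortably covers your longest expected gap):

```flux
from(bucket: "telegraf")
    |> range(start: -90d)  // wide enough to span long vacations
    |> filter(fn: (r) => r["_measurement"] == "solar")
    |> filter(fn: (r) => r["topic"] == "import")
    |> tail(n: 2)          // keep only the two most recent points
    |> difference()        // delta between them, regardless of gap length
```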
If anyone else has any other option, feel free to comment.
Hi,
I have run into the same problem. When I run the task manually, it inserts data into the “control” bucket. When the task runs automatically, no data is inserted into “control”, but the logs report success. Below is my task; can anyone help?
```flux
option task = {name: "control_power_forever", every: 15m}

// Latest control_power reading per 15m window
control = from(bucket: "forever")
    |> range(start: -task.every)
    |> filter(fn: (r) => r._measurement == "T3_5s")
    |> filter(fn: (r) => r["_field"] == "control_power" and r.type == "2")
    |> aggregateWindow(every: 15m, fn: last)
    |> pivot(rowKey: ["_time"], columnKey: ["_field"], valueColumn: "_value")
    |> group(columns: ["_time"], mode: "by")
    |> unique(column: "control_power")

original = () => {
    // Meter energy: per-window delta, scaled by 4 (15-minute delta -> hourly rate)
    meter = from(bucket: "forever")
        |> range(start: -task.every)
        |> filter(fn: (r) => r._measurement == "T3_5s")
        |> filter(fn: (r) => r["_field"] == "ac_history_negative_power_in_kwh" and r.type == "2")
        |> aggregateWindow(every: 15m, fn: last)
        |> pivot(rowKey: ["_time"], columnKey: ["_field"], valueColumn: "_value")
        |> difference(columns: ["ac_history_negative_power_in_kwh"])
        |> group(columns: ["_time"], mode: "by")
        |> map(fn: (r) => ({r with ac_history_negative_power_in_kwh: r.ac_history_negative_power_in_kwh * 4.0}))
        |> sum(column: "ac_history_negative_power_in_kwh")

    // EMS output energy: same delta-and-scale treatment
    ems = from(bucket: "forever")
        |> range(start: -task.every)
        |> filter(fn: (r) => r._measurement == "T1_5s")
        |> filter(fn: (r) => r["_field"] == "ems_history_output_energy")
        |> aggregateWindow(every: 15m, fn: last)
        |> pivot(rowKey: ["_time"], columnKey: ["_field"], valueColumn: "_value")
        |> difference(columns: ["ems_history_output_energy"])
        |> group(columns: ["_time"], mode: "by")
        |> map(fn: (r) => ({r with ems_history_output_energy: r.ems_history_output_energy * 4.0}))
        |> sum(column: "ems_history_output_energy")

    return join(tables: {meter, ems}, on: ["_time"])
        |> group(columns: ["_time"], mode: "by")
        |> map(fn: (r) => ({r with originalDemand: r.ems_history_output_energy + r.ac_history_negative_power_in_kwh}))
}

originalDemand = original()

join(tables: {originalDemand, control}, on: ["_time"])
    |> map(fn: (r) => ({r with max: if r.control_power > r.ac_history_negative_power_in_kwh then r.originalDemand - r.control_power else r.originalDemand - r.ac_history_negative_power_in_kwh}))
    |> map(fn: (r) => ({r with _value: if r.max > 0.0 then r.max else 0.0}))
    |> map(fn: (r) => ({r with _field: "real"}))
    |> drop(columns: ["ammeterId", "uuid"])
    |> to(bucket: "control")
```
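Not a confirmed diagnosis, but one common cause of exactly this symptom (manual runs write data, scheduled runs report success but write nothing) is that the source points have not yet been written when the scheduled run queries range(start: -task.every). Flux tasks support an offset option that delays each scheduled execution without changing the window it covers, which may be worth trying here (the 1m value is an assumption; match it to your write latency):

```flux
option task = {
    name: "control_power_forever",
    every: 15m,
    // Wait a bit after each window closes so late-arriving
    // points are already in the bucket when the task queries them.
    offset: 1m,
}
```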