JBoesh
November 1, 2022, 3:54pm
1
Hi,
I need some help creating a query in InfluxDB 2 (Flux).
The "_value" field returns one of two values: TRUE or FALSE.
What I would like is a query that counts only the TRUE values, per day,
so that I have a total of how many times the value was set to TRUE in a day.
Raw data: (screenshot omitted)
When I execute the query below, it counts all state values, but I want to exclude the FALSE ones.
from(bucket: "home_assistant")
|> range(start: v.timeRangeStart, stop: v.timeRangeStop)
|> filter(fn: (r) => r["_measurement"] == "binary_sensor.deurbel_ringing_sensor")
|> filter(fn: (r) => r["_field"] == "state")
|> count()
|> yield(name: "count")
Can anyone help? Thanks in advance!
scott
November 1, 2022, 4:48pm
2
Just filter by r._value == "True":
from(bucket: "home_assistant")
|> range(start: v.timeRangeStart, stop: v.timeRangeStop)
|> filter(fn: (r) => r["_measurement"] == "binary_sensor.deurbel_ringing_sensor")
|> filter(fn: (r) => r["_field"] == "state" and r["_value"] == "True")
|> count()
|> yield(name: "count")
JBoesh
November 2, 2022, 10:33am
3
This indeed gives me the total number of times "True" has occurred, but ultimately I want to see this in a graph on a timeline.
Do you have an idea how I can solve this?
In "_time" I can see when "True" occurred, but Grafana doesn't seem to recognize this as a date format?
scott
November 2, 2022, 7:08pm
5
@JBoesh Ok, you want to use aggregateWindow() to window your data into specified time intervals and return the count for each window:
from(bucket: "home_assistant")
|> range(start: v.timeRangeStart, stop: v.timeRangeStop)
|> filter(fn: (r) => r["_measurement"] == "binary_sensor.deurbel_ringing_sensor")
|> filter(fn: (r) => r["_field"] == "state" and r["_value"] == "True")
|> aggregateWindow(every: v.windowPeriod, fn: count)
|> yield(name: "count")
Aggregate functions like count() actually drop the _time column because the function doesn't know what timestamp to use for the aggregate value. Using aggregateWindow() solves that for you.
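Since the original goal was one total per day, here is a minimal sketch that swaps v.windowPeriod for a fixed 1d window; it assumes the same bucket, measurement, and field as above, and the yield name "daily_count" is just an illustrative label:

from(bucket: "home_assistant")
|> range(start: v.timeRangeStart, stop: v.timeRangeStop)
|> filter(fn: (r) => r["_measurement"] == "binary_sensor.deurbel_ringing_sensor")
// keep only the "True" states before counting
|> filter(fn: (r) => r["_field"] == "state" and r["_value"] == "True")
// one window per day (UTC-aligned), counted with fn: count
|> aggregateWindow(every: 1d, fn: count)
|> yield(name: "daily_count")

Each window's count is assigned the window's stop time by default, so the result graphs cleanly on a time axis in Grafana.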
JBoesh
November 3, 2022, 7:49am
6
Great, thank you! It looks like it's working.