Recent release of InfluxDB (2.0.2) - last() function not working as expected

We have updated our database from 2.0.0rc to 2.0.2. In 2.0.2, the last() function does not work as expected: in 2.0.0rc it returns the final record, but in 2.0.2 it does not. I have run the script below in both 2.0.0rc and 2.0.2, and they produce different results.

start = 2019-01-01T00:00:00Z
stop = 2019-05-01T00:00:00Z
from(bucket: "TestLastFunction")
  |> range(start: start, stop: stop)
  |> filter(fn: (r) => r["_measurement"] == "Electrical")
  |> filter(fn: (r) => r["MeterNumber"] == "IncreaseValuePerDayFixedRange")
  |> filter(fn: (r) => r["_field"] == "Consumption")
  |> last()

I have created a file to replicate the issue - unfortunately, as a new user I am not allowed to upload it to this post. The file contains >8000 lines similar to the lines below. The timestamps are in milliseconds.

Electrical,MeterNumber=IncreaseValuePerDayFixedRange Consumption=0 1546300800000
Electrical,MeterNumber=IncreaseValuePerDayFixedRange Consumption=0.0104166666666667 1546301700000
Electrical,MeterNumber=IncreaseValuePerDayFixedRange Consumption=0.0208333333333334 1546302600000
Electrical,MeterNumber=IncreaseValuePerDayFixedRange Consumption=0.0312500000000001 1546303500000
Electrical,MeterNumber=IncreaseValuePerDayFixedRange Consumption=0.0416666666666668 1546304400000
Electrical,MeterNumber=IncreaseValuePerDayFixedRange Consumption=0.0520833333333335 1546305300000
Electrical,MeterNumber=IncreaseValuePerDayFixedRange Consumption=0.0625000000000002 1546306200000
Electrical,MeterNumber=IncreaseValuePerDayFixedRange Consumption=0.0729166666666669 1546307100000
Electrical,MeterNumber=IncreaseValuePerDayFixedRange Consumption=0.0833333333333336 1546308000000
Electrical,MeterNumber=IncreaseValuePerDayFixedRange Consumption=0.0937500000000003 1546308900000
Electrical,MeterNumber=IncreaseValuePerDayFixedRange Consumption=0.104166666666667 1546309800000

Link to file (OneDrive)
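As a sanity check on the sample lines above (a plain-Python sketch, not part of the Flux query; the timestamps are copied from the sample), the first millisecond timestamp is exactly 2019-01-01T00:00:00Z - the query's range start - and consecutive points are 900,000 ms = 15 minutes apart:

```python
from datetime import datetime, timezone

# Millisecond timestamps copied from the first sample lines above
timestamps = [1546300800000, 1546301700000, 1546302600000, 1546303500000]

# Consecutive points are 900,000 ms = 15 minutes apart
gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
print(gaps)  # [900000, 900000, 900000]

# The first point is 2019-01-01T00:00:00Z, the query's range start
first = datetime.fromtimestamp(timestamps[0] / 1000, tz=timezone.utc)
print(first.isoformat())  # 2019-01-01T00:00:00+00:00
```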

I think by default it groups by time, and that is why it shows you the last() for each different time.

Try this, grouping by the tag:

from(bucket: "TestLastFunction")
  |> range(start: start, stop: stop)
  |> filter(fn: (r) => r["_measurement"] == "Electrical")
  |> filter(fn: (r) => r["_field"] == "Consumption")
  |> filter(fn: (r) => r["MeterNumber"] == "IncreaseValuePerDayFixedRange")
  |> group(columns: ["MeterNumber"])
  |> last()
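The per-table behavior described above can be sketched in plain Python (illustrative series names and values, not actual Flux): Flux splits results into one table per group key, last() keeps the last row of each table, and group() merges the tables first so that only one last row remains.

```python
# Hypothetical rows: (group_key, time, value) - names are illustrative only
rows = [
    ("seriesA", 1, 10.0),
    ("seriesA", 2, 11.0),
    ("seriesB", 1, 20.0),
    ("seriesB", 3, 21.0),
]

# Without group(): last() runs per table, one "last" row per group key
by_table = {}
for key, t, v in sorted(rows, key=lambda r: r[1]):
    by_table[key] = (t, v)
print(by_table)  # {'seriesA': (2, 11.0), 'seriesB': (3, 21.0)}

# With group(): tables are merged first, so last() yields a single row
last_overall = max(rows, key=lambda r: r[1])
print(last_overall)  # ('seriesB', 3, 21.0)
```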

And if you want to show the trend of the points on a graph, add this before last():

  |> aggregateWindow(every: 15m, fn: last, createEmpty: false)  

I have put 15m because I think your data is collected every 15 minutes.
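The effect of aggregateWindow(every: 15m, fn: last) - keeping the last point of each 15-minute window instead of a single point overall - can be roughly sketched like this (plain Python, with made-up points, not Flux):

```python
# Made-up points: (ms_timestamp, value); window length is 15 minutes
points = [(1546300800000, 0.0), (1546301000000, 0.5),
          (1546301700000, 1.0), (1546302600000, 2.0)]

WINDOW_MS = 15 * 60 * 1000  # 900,000 ms

# Keep the last point per window, like aggregateWindow(every: 15m, fn: last)
windows = {}
for ts, val in sorted(points):
    windows[ts // WINDOW_MS] = (ts, val)

print(sorted(windows.values()))
# [(1546301000000, 0.5), (1546301700000, 1.0), (1546302600000, 2.0)]
```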

Adding the group() function does fix my issue, so thank you!

However, I still consider the behavior faulty, because the last() function works differently in 2.0.0 and 2.0.2. I would also like to clarify that in 2.0.2, the last() function returns only one sample… Secondly, all records are in the same table (table 0), so - to my understanding - the group() function should not be necessary.

I just encountered this issue. I'm retrieving metric values in timestamp order, and last() worked correctly for a few days (returning the single most recent value), then stopped working (returning a single older value). No configuration changes were made to the database or to the query between the time it was working and the time it stopped.

Adding a group() statement to the query has restored the expected behaviour.

I can reproduce this oddness with my dataset in the data explorer, and have observed the following:

a) first() always works, with and without a group() - returns _time 2020-12-13T22:37:33.494Z
b) last() without the group() returns _time 2020-12-13T23:59:55.833Z
c) last() with the group() returns _time 2020-12-16T09:10:57.566Z

There are over a thousand records between 2020-12-13T22:37:33.494Z and 2020-12-13T23:59:55.833Z, and tens of thousands of records after that timestamp. Nothing unusual happened at that time, either. A full query shows results before, at, and after that time.

Definitely something odd going on.

A patch that fixes this problem has been submitted and should be included in the next release. The patch can be found here: