```flux
import "array"
import "experimental"
import "experimental/json"

array.from(
    rows: [
        {_time: experimental.subDuration(from: now(), d: 30m), _value: "69"},
        {_time: now(), _value: "1"},
    ],
)
    |> map(fn: (r) => ({r with _value: json.parse(data: bytes(v: r._value))}))
```
Now, if some of the values were not JSON at all, but garbage, I wouldn’t even be able to filter them out with `exists` hacks.
Are there any plans to implement error handling so that I can filter out rows like that while retaining the healthy rows in my query?
Hi @qm3ster,
I have passed this on to the product management board for comment. Sadly, I don’t believe there is an out-of-the-box solution at the moment. For now, you could use some clever conditional logic to check the characteristics of your input before parsing the JSON.
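As a rough sketch of that workaround, you could screen rows with `strings.hasPrefix` before calling `json.parse`, so only values that at least look like JSON objects or arrays reach the parser. Note this is only a heuristic, not real error handling: a string starting with `{` can still be malformed JSON and would still abort the query. `data` below stands in for whatever stream feeds your `map`.

```flux
import "strings"
import "experimental/json"

// Keep only rows whose raw value starts like a JSON object or array,
// then parse the survivors. Garbage such as "not json" is dropped
// before json.parse() ever sees it.
data
    |> filter(
        fn: (r) =>
            strings.hasPrefix(v: r._value, prefix: "{") or strings.hasPrefix(v: r._value, prefix: "["),
    )
    |> map(fn: (r) => ({r with _value: json.parse(data: bytes(v: r._value))}))
```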
I believe the language will need some kind of error handling in the long run, be that via a Result
type or some kind of exception catching, but I see it isn’t possible right now.
Anyway, my data massaging is somewhat beyond the reach of Flux and even Telegraf at the moment, so I will continue to run my own executables.