I’m having an issue when querying an instance of InfluxDB OSS 2.0.5, where the following query:
query = f'from(bucket: "{bucket}") \
    |> range(start: {from_date}, stop: {today.isoformat()}T23:00:00Z) \
    |> filter(fn: (r) => r._field == "prob") \
    |> aggregateWindow(every: 1h, fn: count)'
res = client.query_api().query_data_frame(query=query, org=org)
…returns the following error:
---------------------------------------------------------------------------
FluxCsvParserException Traceback (most recent call last)
<ipython-input-63-5125eec60c07> in <module>
----> 1 res = client.query_api().query_data_frame(query=query, org=org)
2 res
~/projects/tectal/notebooks/tectalenv/lib/python3.8/site-packages/influxdb_client/client/query_api.py in query_data_frame(self, query, org, data_frame_index, params)
145
146 _generator = self.query_data_frame_stream(query, org=org, data_frame_index=data_frame_index, params=params)
--> 147 _dataFrames = list(_generator)
148
149 if len(_dataFrames) == 0:
~/projects/tectal/notebooks/tectalenv/lib/python3.8/site-packages/influxdb_client/client/flux_csv_parser.py in generator(self)
69 """Return Python generator."""
70 with self as parser:
---> 71 yield from parser._parse_flux_response()
72
73 def _parse_flux_response(self):
~/projects/tectal/notebooks/tectalenv/lib/python3.8/site-packages/influxdb_client/client/flux_csv_parser.py in _parse_flux_response(self)
113 table_id = -1
114 elif table is None:
--> 115 raise FluxCsvParserException("Unable to parse CSV response. FluxTable definition was not found.")
116
117 # # datatype,string,long,dateTime:RFC3339,dateTime:RFC3339,dateTime:RFC3339,double,string,string,string
FluxCsvParserException: Unable to parse CSV response. FluxTable definition was not found.
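For context, the client and the query inputs are set up roughly like this (the URL, token, org, bucket name, and dates below are placeholders, not my real values):

import datetime
from influxdb_client import InfluxDBClient

# Placeholder connection details
client = InfluxDBClient(url="http://localhost:8086", token="my-token", org="my-org")
org = "my-org"
bucket = "default-bucket"

# Time window used to build the Flux range()
today = datetime.date.today()
from_date = (today - datetime.timedelta(days=30)).isoformat()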
I suspect the error is telling me that the requested bucket was not found on the server. However, the exact same query works against InfluxDB Cloud, and copying the query string into an HTTP call in Postman and posting it to the /query endpoint on the same server works as well.
The only difference I can spot between using the HTTP endpoint and the Python client is the quote escaping:
In my Postman call, I had to escape the quotes manually before posting, using a single escape like this: \".
In the call sent by the Python SDK, I notice that quotes are automatically double-escaped, like so: \\"
Could this be the cause of my issue? If so, how do I single-escape quotes with the Python SDK?
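For what it’s worth, printing the query string right before sending it shows plain double quotes (bucket name is a placeholder):

print(query)
# from(bucket: "default-bucket") |> range(start: 2021-09-06, stop: 2021-10-08T23:00:00Z) |> filter(fn: (r) => r._field == "prob") |> aggregateWindow(every: 1h, fn: count)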
Below is my POST body (working):
{
  "extern": {"imports": [], "body": []},
  "query": "from(bucket: \"default-bucket\") |> range(start: 2021-09-06, stop: 2021-10-08T23:00:00Z) |> filter(fn: (r) => r._field == \"prob\") |> aggregateWindow(every: 1h, fn: count)",
  "type": "flux",
  "params": {},
  "dialect": {
    "header": true,
    "delimiter": ",",
    "annotations": ["group"],
    "commentPrefix": "#",
    "dateTimeFormat": "RFC3339"
  },
  "now": "2019-08-24T14:15:22Z"
}
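For comparison, the working Postman request corresponds roughly to the following (host, token, and org are placeholders; the body mirrors the JSON above):

import requests

body = {
    "query": 'from(bucket: "default-bucket") |> range(start: 2021-09-06, stop: 2021-10-08T23:00:00Z) '
             '|> filter(fn: (r) => r._field == "prob") |> aggregateWindow(every: 1h, fn: count)',
    "type": "flux",
    "dialect": {"annotations": ["group"], "header": True, "delimiter": ","},
}

resp = requests.post(
    "http://localhost:8086/api/v2/query",
    params={"org": "my-org"},
    headers={
        "Authorization": "Token my-token",
        "Content-Type": "application/json",
        "Accept": "application/csv",
    },
    json=body,
)
print(resp.status_code)
print(resp.text[:200])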
Here’s the output from the Python client in debug mode:
send: b'{"extern": {"imports": [], "body": []},
"query": "from(bucket: \\"default-bucket\\")
|> range(start: 2021-09-06, stop: 2021-10-08T23:00:00Z)
|> filter(fn: (r) => r._field == \\"prob\\") |>
aggregateWindow(every: 1h, fn: count )",
"dialect": {"header": true, "delimiter": ",",
"annotations": ["datatype", "group", "default"],
"commentPrefix": "#", "dateTimeFormat": "RFC3339"}}'
I’m on the latest version of the Python influxdb-client library.