When I enter the following query:
SELECT mean("close") AS "mean_close" FROM "StockData"."autogen"."intraday_data" GROUP BY symbol FILL(null)
time                           intraday_data.mean_close  symbol
----                           ------------------------  ------
1969-12-31T18:00:00.000-06:00  285.253120586821          QQQ
1969-12-31T18:00:00.000-06:00  331.414171115378          SPY
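A side note on that output: the timestamps are Unix epoch 0 rendered at the client's UTC-6 offset, which is how InfluxQL reports an aggregate taken over the full time range (no `GROUP BY time(...)`). A quick check using only the Python standard library:

```python
from datetime import datetime, timedelta, timezone

# The timestamp shown in the query result, at the UTC-6 offset.
ts = datetime(1969, 12, 31, 18, 0, 0, tzinfo=timezone(timedelta(hours=-6)))

# It is exactly the Unix epoch (1970-01-01T00:00:00Z).
print(ts.timestamp())  # → 0.0
```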
But when I try to list the symbol tag values as a query:
SHOW TAG VALUES ON "StockData" FROM "intraday_data" WITH KEY = "symbol"
“Your query or command is syntactically correct but returned no results.”
How does the SELECT know about the symbol tag, but as soon as I ask it for a list of known symbol values it comes up empty?
I’ve tested the same command and I get the correct result.
Where are you running the command? Can you try it on the CLI?
The error itself usually happens when you look for a key that does not exist (perhaps due to a misspelling).
The only other check I usually do is about the connection, but you specified it in the command itself so that should not be an issue.
Thank you for taking a look. Maybe there’s something wrong in my data? Here is a complete example:
from influxdb import InfluxDBClient
influxClient = InfluxDBClient(host='192.168.1.13')
json_body = [
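The `json_body` above is cut off, so for context here is a minimal sketch of what one complete point could look like. The measurement `data` and tag key `tagName` come from the query below; the tag value and field are placeholders I made up:

```python
# Hypothetical completion of the truncated example: one point in the
# measurement "data" with a single tag "tagName" (value and field assumed).
json_body = [
    {
        "measurement": "data",
        "tags": {"tagName": "exampleValue"},
        "fields": {"value": 1.0},
    }
]

# With a reachable server, this would be written via the client:
# influxClient.switch_database("TestDatabase")
# influxClient.write_points(json_body)
```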
and the query:
SHOW TAG VALUES ON "TestDatabase" FROM "data" WITH KEY = "tagName"
Comes up with nothing. I’ve run this on both the CLI and the Chronograf interface.
I’ll be honest, it makes no sense… (or maybe I’m just blind)
Which InfluxDB version are you running?
Can you run
SHOW TAG VALUES ON "TestDatabase" WITH KEY = "tagName"
just out of curiosity? Maybe removing the measurement filter will make it work.
I am running 1.6.4 on Ubuntu Server 20.10 (groovy).
The query returns nothing even though
SHOW TAG KEYS ON "TestDatabase" lists tagName.
Further, the server recognizes the cardinality of the tag values:
SHOW TAG VALUES CARDINALITY ON "TestDatabase" WITH KEY = "tagName" returns 1.0, and similar queries against my other databases return reasonable values.
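For reference, these cross-checks can be scripted with the same Python client. This is only a sketch that builds the InfluxQL statements used in this thread; the actual `client.query(...)` call (commented out) would need a reachable InfluxDB 1.x server:

```python
def diagnostic_queries(database, key):
    """Return the InfluxQL statements used above to cross-check a tag key."""
    return [
        f'SHOW TAG KEYS ON "{database}"',
        f'SHOW TAG VALUES ON "{database}" WITH KEY = "{key}"',
        f'SHOW TAG VALUES CARDINALITY ON "{database}" WITH KEY = "{key}"',
    ]

if __name__ == "__main__":
    for q in diagnostic_queries("TestDatabase", "tagName"):
        print(q)
        # result = influxClient.query(q)  # requires a reachable server
```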
Might be a version bug…
I had a look around without finding too much, but it looks like other people have hit the same problem more or less randomly from v1.4 to v1.6.
I suggest you upgrade to the latest release, v1.8.6.