Hello!
I have a query that worked as expected for over a month. This morning it started failing with an error that I can't understand how to fix. To the best of my knowledge nothing has changed, so I can't figure out why the query is failing.
Query that worked for a month and is now failing:
SELECT DATE_BIN_GAPFILL(INTERVAL '24 hours', time, '1970-01-01T00:00:00Z'::TIMESTAMP) AS time,
SUM(amount) / 100
FROM "u-buy-amount-v7"
WHERE time >= now() - interval '30 days' AND time >= '2023-09-12' AND time <= now() AND "type"='sponsorship'
GROUP BY 1
ORDER BY 1
Error:
Error while planning query: Optimizer rule 'handle_gap_fill' failed caused by Schema error: No field named "u-buy-amount-v7".type. Valid fields are "u-buy-amount-v7".amount, "u-buy-amount-v7".time.: rpc error: code = InvalidArgument desc = Error while planning query: Optimizer rule 'handle_gap_fill' failed caused by Schema error: No field named "u-buy-amount-v7".type. Valid fields are "u-buy-amount-v7".amount, "u-buy-amount-v7".time.
Screenshot of the schema, which shows that the type tag is there
If I remove the AND clause with type, then the query works (but obviously this is not the query I need):
SELECT DATE_BIN_GAPFILL(INTERVAL '24 hours', time, '1970-01-01T00:00:00Z'::TIMESTAMP) AS time,
SUM(amount) / 100
FROM "u-buy-amount-v7"
WHERE time >= now() - interval '30 days' AND time >= '2023-09-12' AND time <= now()
GROUP BY 1
ORDER BY 1
If I run the query below, it works, which confirms that the type tag is in the schema and is handled by InfluxDB:
SELECT *
FROM "u-buy-amount-v7"
WHERE time >= now() - interval '30 days' AND time >= '2023-09-12' AND time <= now() AND "type"='sponsorship'
Arrrgh …
Any help from the community, please?
Hello @wonderer,
Welcome!
I’m not sure what’s going on.
I’m asking around. I appreciate your patience.
Thanks a lot! I am eagerly waiting … : )
Hey @wonderer ,
I’m a developer at Influx. The error message looks like a bug. Could you share the schema of the u-buy-amount-v7
table w/ me so I can file an internal ticket with the relevant information? You can easily get the data types via SQL using:
SELECT *
FROM information_schema.columns
WHERE table_name = 'u-buy-amount-v7';
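Depending on the SQL surface you're on, SHOW COLUMNS may give you the same information in a shorter form (this is an assumption on my part; if it errors, just use the information_schema query above):

SHOW COLUMNS FROM "u-buy-amount-v7";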
Hi!
So, I did this query:
SELECT *
FROM information_schema.columns
WHERE table_name = 'u-buy-amount-v7';
Screenshot with the result
Result of this query, exported as CSV and pasted as a table (sorry, I don't know how to attach CSV files here):
(annotation rows and empty columns omitted for readability)

table_catalog  table_schema  table_name       column_name         ordinal_position  is_nullable  data_type                    character_maximum_length
public         iox           u-buy-amount-v7  amount              0                 YES          Int64
public         iox           u-buy-amount-v7  sponsor-blockchain  1                 YES          Utf8                         2147483647
public         iox           u-buy-amount-v7  sponsor-name        2                 YES          Utf8                         2147483647
public         iox           u-buy-amount-v7  sponsor-uid         3                 YES          Utf8                         2147483647
public         iox           u-buy-amount-v7  ti-blockchain       4                 YES          Utf8                         2147483647
public         iox           u-buy-amount-v7  ti-name             5                 YES          Utf8                         2147483647
public         iox           u-buy-amount-v7  ti-uid              6                 YES          Utf8                         2147483647
public         iox           u-buy-amount-v7  ticker              7                 YES          Utf8                         2147483647
public         iox           u-buy-amount-v7  time                8                 NO           Timestamp(Nanosecond, None)
public         iox           u-buy-amount-v7  type                9                 YES          Dictionary(Int32, Utf8)
I can confirm that this is indeed a bug: the schema clearly contains the type column, yet the handle_gap_fill optimizer rule doesn't see it. I've filed an internal bug report.
Okay, it's good that there is a plan : ) But what is the timeline? I have a production system that needs this query to work. Or, more precisely: what are my options until the bug gets fixed?
In this case the fix should hopefully be available by the end of this week.
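Until then, one possible workaround (an untested sketch on my part, so please verify it against your data) is to move only the "type" predicate into a subquery, so that the outer gap-fill query references nothing but the time and amount columns, while the time-range predicates stay at the outer level where the gap-fill rule expects them:

SELECT DATE_BIN_GAPFILL(INTERVAL '24 hours', time, '1970-01-01T00:00:00Z'::TIMESTAMP) AS time,
       SUM(amount) / 100
FROM (
    -- pre-filter on the tag so the gap-fill planner never sees "type";
    -- the subquery alias "filtered" is just an arbitrary name
    SELECT time, amount
    FROM "u-buy-amount-v7"
    WHERE "type" = 'sponsorship'
) AS filtered
WHERE time >= now() - interval '30 days' AND time >= '2023-09-12' AND time <= now()
GROUP BY 1
ORDER BY 1

Whether this actually sidesteps the handle_gap_fill rule depends on how the optimizer rewrites the plan (predicate pushdown may recreate the same situation), so treat it as a guess rather than a guaranteed fix.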
In general, though, urgent requests should go through support, NOT the community forum, since bug reports are triaged and prioritized in a central place.
End of this week - great!
Well, it's okay to go through support if one knows that it's a bug. In my case I wasn't sure, since it might have been something I was doing wrong.