WHERE clause not working in InfluxDB

influxdb
#1

All,

Here are a few records from my series; Filename and Duration are field values:

> select * from GTIF_FS2GTP_D where "Duration" > 15 limit 5
name: GTIF_FS2GTP_D
time                     Duration Filename
----                     -------- --------
2017-04-11T13:50:04.063Z 15.807   \\\\abc\\abcfiles\\group\\ftp\\CRMurexExport\\MxCR_20170411_13858\.txt
2017-04-11T13:50:04.093Z 16.014   \\\\abc\\abcfiles\\group\\ftp\\CRMurexExport\\MxCR_20170411_13859\.txt
2017-04-11T13:50:04.11Z  16.197   \\\\abc\\abcfiles\\group\\ftp\\CRMurexExport\\MxCR_20170411_13860\.txt
2017-04-11T13:50:20.91Z  15.36    \\\\abc\\abcfiles\\group\\ftp\\CRMurexExport\\MxCR_20170411_13861\.txt
2017-04-11T13:50:20.927Z 15.543   \\\\abc\\abcfiles\\group\\ftp\\CRMurexExport\\MxCR_20170411_13862\.txt

When I try to search for a point based on the Filename value:

> select * from GTIF_FS2GTP_D where "Filename" = '\\\\abc\\abcfiles\\group\\ftp\\CRMurexExport\\MxCR_20170411_13858\.txt'

I get this error:
ERR: error parsing query: found \., expected identifier, string, number, bool at line 1, char 124

Please advise,

John

#2

@johnyung88 Can you fix your client writes so there are fewer \s in your data? This is an escaping issue.
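For illustration, a minimal client-side sketch of that escaping in Python (the `escape_backslashes` helper is not part of any InfluxDB client library, and whether doubled backslashes in a single-quoted InfluxQL string literal round-trip correctly depends on the server version — verify against yours):

```python
def escape_backslashes(value: str) -> str:
    """Double every backslash before embedding the value in an InfluxQL
    string literal, so the query parser does not choke on sequences like
    '\\.'. This assumes the server reads '\\\\' as one literal backslash;
    check this against your InfluxDB version.
    """
    return value.replace("\\", "\\\\")


filename = r"\\abc\abcfiles\group\ftp\CRMurexExport\MxCR_20170411_13858.txt"

# Build the query with the escaped literal instead of the raw UNC path.
query = "SELECT * FROM GTIF_FS2GTP_D WHERE \"Filename\" = '{}'".format(
    escape_backslashes(filename)
)
print(query)
```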

#3

I un-escaped the Filename values, but I am still getting the same error:

> select * from GTIF_FS limit 3

name: GTIF_FS
time                     Count FileSize Filename
----                     ----- -------- --------
2017-05-01T02:01:36.407Z 1     876      \\abc\abcfiles\group\ftp\CRMurexExport\MxCR_20170501_21665.txt
2017-05-01T02:06:45.217Z 1     917      \\abc\abcfiles\group\ftp\CRMurexExport\MxCR_20170501_21666.txt
2017-05-01T02:06:50.317Z 1     917      \\abc\abcfiles\group\ftp\CRMurexExport\MxCR_20170501_21667.txt

> select * from GTIF_FS where "Filename" = '\sorosfunds\sfmfiles\group\ftp\CRMurexExport\MxCR_20170501_21665.txt'
ERR: error parsing query: found \s, expected identifier, string, number, bool at line 1, char 56

The Filename field stores files as UNC paths (folder plus filename), so I don't think I can use fewer `\`s. Can I tell InfluxDB to do the string comparison without tripping over the `\`s?
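One workaround I am looking at is a regex comparison (`=~`) instead of equality, since InfluxQL supports regex matching on string field values in the WHERE clause (in 1.x versions; check yours). A Python sketch that builds the pattern with `re.escape`, so the backslashes in the UNC path are escaped mechanically rather than by hand (names are illustrative):

```python
import re

filename = r"\\abc\abcfiles\group\ftp\CRMurexExport\MxCR_20170501_21665.txt"

# re.escape backslash-escapes every regex metacharacter, including the
# backslashes and dots in the UNC path, yielding a pattern that matches
# the path literally.
pattern = re.escape(filename)

# InfluxQL regex comparison on a string field: the pattern goes between
# slashes, unquoted; '$' anchors the match at the end of the value.
query = 'SELECT * FROM GTIF_FS WHERE "Filename" =~ /{}$/'.format(pattern)
print(query)
```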

Thanks

John