Timestamp column could not be found

Hello everyone!

I am having a few problems while writing a csv file to InfluxDB through Telegraf.

This is the part of the configuration file that applies to reading the csv:

  ## Data format to consume.
  data_format = "csv"
  csv_column_names = ["measurement","pressure","sensor","timestamp","id"]
  csv_column_types = ["string", "int", "string", "string", "int"]
  csv_delimiter = ";"
  csv_tag_columns = ["sensor","id"]
  csv_measurement_column = "measurement"
  csv_timestamp_column = "timestamp"
  csv_timestamp_format = "2006-01-02 15:04:05" 
  csv_timezone = "Europe/Madrid"

Here is an example of the csv file:

pressure;287;S6;2024-01-29 08:17:00.000;1
pressure;153;S6;2024-01-29 08:17:04.000;1
pressure;356;S6;2024-01-29 08:17:07.000;1
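
For reference, what I would expect that first row to produce is roughly this line protocol (my own illustration, with the 08:17:00 Europe/Madrid timestamp converted to UTC nanoseconds):

pressure,id=1,sensor=S6 pressure=287i 1706512620000000000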

But Telegraf is outputting an error saying the timestamp column could not be found.

It is strange, since it had been working properly until I added environment variables to the Telegraf configuration file. Also, when I comment out the lines regarding the timestamp, it reads the csv properly and assigns the current timestamp to the metrics.

Note: I am using the [[inputs.exec]] plugin to create the csv, but I have only included the csv configuration section here, since it appears to be the source of the problem.
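
In case it helps to picture it, those parser options sit directly inside the [[inputs.exec]] section, roughly like this (the command path is just a placeholder, not my real script):

[[inputs.exec]]
  commands = ["/path/to/my_export_script.sh"]   ## placeholder for my bash script
  ## Data format to consume.
  data_format = "csv"
  ## ...followed by the csv_* options shown above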

I have been blocked at this point for a few days, so if any of you could help me out, I would highly appreciate it.

Thank you in advance.

Are you sure you are getting what you think you are getting, and not some error? That config works for me using the file input, so my guess is that some error output is coming across and, as a result, the missing timestamp column is just the first failure you hit, since that column cannot be found in it.

You could use the value parser to print exactly what you are getting:

[[inputs.exec]]
  commands = ["your command here"]
  data_format = "value"
  data_type = "string"

[[outputs.file]]

and ideally you would see something like:

file value="pressure;287;S6;2024-01-29 08:17:00.000;1
pressure;153;S6;2024-01-29 08:17:04.000;1
pressure;356;S6;2024-01-29 08:17:07.000;1" 1706536699000000000
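
If you would rather not touch your outputs at all, you can also run Telegraf once in test mode with something like the command below (adjust the config path to yours); it gathers from the inputs a single time and prints the resulting metrics to stdout:

telegraf --test --config /etc/telegraf/telegraf.conf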

Hello,

Thank you very much for the prompt and useful response; the ‘value’ data format option has proved to be really insightful.

As you mentioned, the file generated by the bash script was in the correct format. Using data_format = "value", I realised that I had forgotten to uncomment the part of the script that redirected some lines printed by the bcp utility while copying the data. So, when the bash script was executed on its own, the resulting file had the right metrics and format, but when it was executed by Telegraf, these extra lines were included as well, resulting in the error.

This is what Telegraf produced after trying data_format = "value":

> exec value="Starting copy...
Massively copied 1000 rows to the host file. Total received: 1000

1985 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total     : 172    Average : (11540.70 rows per sec.)
pressure;287;S6;2024-01-29 08:17:00.000;1
pressure;153;S6;2024-01-29 08:17:04.000;1
...
pressure;356;S6;2024-01-29 08:17:07.000;1" 1706702752000000000

As you can see, these informational lines appeared at the top but, since I was not saving them to the file when running the script on its own, they had gone unnoticed.
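
For anyone running into the same thing, the missing piece in my script was essentially this kind of redirection (the query, file path and connection options below are placeholders, not my real ones):

# export the data with bcp, sending its status messages away from stdout
bcp "SELECT ..." queryout /tmp/pressure.csv -c -t ';' -S myserver -U myuser -P mypassword > /dev/null
# print only the csv itself, so Telegraf reads clean rows
cat /tmp/pressure.csv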

Silly mistake but at least I learnt about the ‘value’ data format for future tests.

Thank you very much.
