Csv2influx: telegraf has no response and no error logs

Hi everyone, I've run into some problems with Telegraf and InfluxDB.
This is my Telegraf configuration:

[[inputs.file]]
  files = ["/root/test01.csv"]
  data_format = "csv"
  csv_header_row_count = 1
  csv_tag_columns = ["Channel"]
  csv_measurement_column = "Measurement"
  csv_timestamp_column = "Time"
  csv_timestamp_format = "unix"

And this is my CSV file:

Measurement,Channel,Time,Fields
14336_Tube,InPower,1698948568361,0.04411018730718063
14336_Tube,InPower,1698948568362,0.08892599555595807
14336_Tube,InPower,1698948568363,0.07441728064293723

No error log appears, and my bucket doesn't get any data.

Sorry for these beginner questions, but I'm new to the world of Telegraf and InfluxDB.

The timestamp values in your data are in milliseconds; use the unix_ms format:

  data_format = "csv"
  csv_header_row_count = 1
  csv_tag_columns = ["Channel"]
  csv_measurement_column = "Measurement"
  csv_timestamp_column = "Time"
  csv_timestamp_format = "unix_ms"

Hi @Jason_Stirnaman, thank you for your suggestion.
I changed csv_timestamp_format = "unix_ms" and it works, but my bucket only has one data point.

I also checked the output file:

1698948568,14336_Tube,InPower,hadoop100,0.04411018730718063
1698948568,14336_Tube,InPower,hadoop100,0.08892599555595807
1698948568,14336_Tube,InPower,hadoop100,0.07441728064293723
1698948568,14336_Tube,InPower,hadoop100,0.04411018730718063
1698948568,14336_Tube,InPower,hadoop100,0.08892599555595807
1698948568,14336_Tube,InPower,hadoop100,0.07441728064293723
1698948568,14336_Tube,InPower,hadoop100,0.04411018730718063
1698948568,14336_Tube,InPower,hadoop100,0.08892599555595807
1698948568,14336_Tube,InPower,hadoop100,0.07441728064293723

It doesn't match the input file. Any suggestions?

Ah, because each line has the same timestamp and tags, you essentially update the same point with a new value for each line. Sorry I didn't catch that. To record each of these points in a series, you'd want to increase the timestamp precision in your data and then adjust your Telegraf timestamp format setting accordingly.
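
For reference, InfluxDB identifies a point by its measurement, tag set, and timestamp. With second-precision timestamps, your three rows all serialize to roughly the following line protocol (a sketch of what Telegraf would write, assuming the host tag hadoop100 from your output file; timestamps shown at second precision), so every write lands on the same point:

14336_Tube,Channel=InPower,host=hadoop100 Fields=0.04411018730718063 1698948568
14336_Tube,Channel=InPower,host=hadoop100 Fields=0.08892599555595807 1698948568
14336_Tube,Channel=InPower,host=hadoop100 Fields=0.07441728064293723 1698948568

Because the measurement, tags, and timestamp are identical, each line overwrites the previous one and only the last value survives.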

I set this in outputs.file:

csv_timestamp_format = "unix_ms"

and it works, but my bucket still only has one data point. Do I need to set anything else in the configuration?

All the lines in your data update the same field in the same point (database row).

No need to be sorry. I realize I misunderstood your previous question. Did you solve it?


Thanks for your suggestions. I've resolved the issue with the output file, and it now matches the content of my input file. But the bucket still only has one data entry. I'm confused because my CSV file has timestamps with millisecond precision; the rows do share the same tag, but they have three different timestamps.
The first timestamp is 1698948568361, the second is 1698948568362, and the last is 1698948568363.
My CSV:

Measurement,Channel,Time,Fields
14336_Tube,InPower,1698948568361,0.04411018730718063
14336_Tube,InPower,1698948568362,0.08892599555595807
14336_Tube,InPower,1698948568363,0.07441728064293723

I've set csv_timestamp_format = "unix_ms" in the Telegraf configuration, so why is only one data entry showing up? Do I need to set anything else in outputs.influxdb_v2?

Have you tried setting agent.precision to 1ms? The influxdb plugin has a precision option, but I think the influxdb_v2 plugin uses the agent precision setting.


Here is my configuration:

[agent]
  interval = "1s"
  round_interval = true
  metric_batch_size = 1000
  metric_buffer_limit = 10000
  collection_jitter = "0s"
  flush_interval = "1s"
  flush_jitter = "0s"
  precision = "1ms"
  hostname = ""
  omit_hostname = false
[[outputs.influxdb_v2]]
  urls = ["http://hadoop100:8086"]
  token = "$INFLUX_TOKEN"
  organization = "ipp"
  bucket = "test01"
  precision = "1ms"
[[inputs.file]]
  files = ["/root/test01.csv"]
  data_format = "csv"
  csv_header_row_count = 1
  csv_tag_columns = ["Channel"]
  csv_measurement_column = "Measurement"
  csv_timestamp_column = "Time"
  csv_timestamp_format = "unix_ms"
[[outputs.file]]  # only for debugging
  files = ["stdout","/root/test_out.csv"]
  data_format = "csv"
  csv_timestamp_format = "unix_ms"

I think everything is configured, but there is still only one data point in the bucket. I don't know where the problem is.

but there is still only one data point in the bucket
Your data and config look fine and I just verified they work for me. What’s your query? If you’re using the Data Explorer UI, make sure to uncheck any aggregations so you see all the points.
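
For example, a raw Flux query with no aggregation should return all three points as separate rows (a minimal sketch, assuming your bucket test01 and a time range that covers the 2023-11-02 timestamps in your data):

// Raw query: no aggregateWindow(), so every stored point comes back unmodified.
from(bucket: "test01")
  |> range(start: 2023-11-02T00:00:00Z, stop: 2023-11-03T00:00:00Z)
  |> filter(fn: (r) => r._measurement == "14336_Tube" and r.Channel == "InPower")

The Data Explorer's query builder typically appends an aggregateWindow(every: v.windowPeriod, fn: mean, ...) step by default, which is what collapses your three points into a single aggregated row.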


I just reviewed your screenshot and this appears to be it: notice the timestamp is rounded to seconds.

I checked my query and now I can see my data. You really helped me!
I didn't know there were so many operations in the Data Explorer UI. It looks like I need to learn more about Flux.