Starlark processing time in telegraf csv input

Trying to ingest CSV data via tail input in Telegraf.

My CSV data has an unusual time format: DATE Type | Microsoft Learn (an OLE Automation serial date, i.e. days since 1899-12-30 as a float, with the fraction of a day after the decimal point).

I have written a Starlark processor to handle it, but it seems the Starlark processing happens after the CSV parser tries to parse the timestamp, so the tail input complains: parsing time "44744.916650" as "Unix": cannot parse "44744.916650" as "Unix"

My Telegraf config:

[[inputs.tail]]

files = ["/opt/influx-expedition.git/testdata/log/**.csv"]
from_beginning = true
path_tag="path"

data_format = "csv"
csv_header_row_count = 1
csv_comment = "!"


csv_timestamp_column = "Utc"
csv_timestamp_format = "Unix"


[[processors.starlark]]
 # Starlark script to convert Expedition's time format
source = '''
load("logging.star", "log")  # load logging library
def apply(metric):

  utc = metric.fields['Utc']

  floatUtc = float(utc)
  intUtc = int(utc)
  secSinceEpoch = -2209161600                      # Unix time of the serial-date epoch, 1899-12-30 00:00 UTC
  intPart = secSinceEpoch + (intUtc * 86400)       # whole days -> seconds
  floatPart = 86400 * (floatUtc - float(intUtc))   # fractional day -> seconds

  result = intPart + int(floatPart)

  metric.fields['Utc'] = result
  log.debug("New metric obj " + str(metric))
  return metric

'''

Telegraf output:

Feb 18 06:51:06 influ telegraf[14089]: 2023-02-18T06:51:06Z E! [inputs.tail] Malformed log line in "/opt/influx-expedition.git/testdata/log/2022Jul02_0.csv": ["44744.916650,0,121.9,4.63,121.8,4.63,3.7,,0,0,0.01,241.9,,16,,115.09,0.88,0.49,-4.36,,,,,,,,,,13.1,,1,,9,,,,,59.321733,18.099383,0,0.01,,,,,,,,,,,,,,,,,,,37576.916667,96.9,276.9,,,,,,,,9.96"]: parsing time "44744.916650" as "Unix": cannot parse "44744.916650" as "Unix"
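
For what it's worth, the arithmetic in the script seems to check out for that row: -2209161600 + 44744 * 86400 + int(0.916650 * 86400) = 1656720000 + 79198 = 1656799198, i.e. 2022-07-02 21:59:58 UTC, which matches the file name and the Utc value in the debug output below. The value just never reaches the processor, because the CSV parser rejects it first.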

If I do not specify csv_timestamp_column and csv_timestamp_format in my Telegraf config file, my data gets passed to InfluxDB, but every row ends up with the same timestamp (the time Telegraf read the line):

Telegraf debug:

Feb 18 06:56:15 influ telegraf[14133]: 2023-02-18T06:56:15Z D! [processors.starlark] New metric obj Metric("tail", tags={"path": "2022Jul02_0.csv"}, fields={"Bsp": 0, "Set": 0, "SeaTemp": 16, "Depth": 115.09, "Trim": 0.49, "Lon": 18.099383, "Cog": 0, "HPE": 9.96, "Awa": 121.9, "Drift": 0.01, "Heel": 0.88, "Rudder": -4.36, "GpQual": 1, "Twd+90": 96.9, "Utc": 1656799198, "Hdg": 241.9, "GpsNum": 9, "Twd-90": 276.9, "Sog": 0.01, "Aws": 4.63, "Leeway": 0, "Volts": 13.1, "Lat": 59.321733, "GPS Time": 37576.916667, "Twa": 121.8, "Tws": 4.63, "Twd": 3.7}, time=1676703375444157061)

My raw CSV file:

Utc,Bsp,Awa,Aws,Twa,Tws,Twd,Rudder2,Leeway,Set,Drift,Hdg,AirTemp,SeaTemp,Baro,Depth,Heel,Trim,Rudder,Tab,Forestay,Downhaul,MastAng,FrstyLen,MastButt,Load S,Load P,Rake,Volts,ROT,GpQual,PDOP,GpsNum,GpsAge,Altitude,GeoSep,GpsMode,Lat,Lon,Cog,Sog,DiffStn,Error,StbRunnr,PrtRunnr,Vang,Trav,Main,KeelAng,KeelHt,CanardH,Oil P,RPM 1,RPM 2,Board,Board 2,DistToLn,RchTmToLn,RchDtToLn,GPS Time,Twd+90,Twd-90,Downhaul2,Mk Lat,Mk Lon,Port lat,Port lon,Stbd lat,Stbd lon,HPE,RH,Lead P,Lead S,BackStay,User 0,User 1,User 2,User 3,User 4,User 5,User 6,User 7,User 8,User 9,User 10,User 11,User 12,User 13,User 14,User 15,User 16,User 17,User 18,User 19,User 20,User 21,User 22,User 23,User 24,User 25,User 26,User 27,User 28,User 29,User 30,User 31,TmToGun,TmToLn,TmToBurn,BelowLn,GunBlwLn,WvSigHt,WvSigPd,WvMaxHt,WvMaxPd,Slam,Motion,Mwa,Mws,Boom Pos,Twist,TackLossT,TackLossD
44744.541114,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,59.3201,18.0887,58.656003,17.368613,58.655733,17.359067,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0
44744.541148,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,44744.541146
44744.541195,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,44744.541192,,,,59.3201,18.0887
44744.541242,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,44744.541238
44744.541288,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,44744.541285
44744.541335,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,44744.541331
44744.541382,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,44744.541377
44744.541429,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,44744.541424
44744.541452,0,137,16.08,136.9,16.1,,,,,,,,17,,90.73
44744.541464,0,137,16.08,136.9,16.1,,,,,,,,,,114.59
44744.541476,0,137,0,45,0,283.1,,0,0,0.01,238.1,,17,,99.49,0.82,0.54,-0.49,,,,,,,,,,,,1,,11,,,,,59.32175,18.099417,0,0.01,,,,,,,,,,,,,,,,45.8841,,,37576.54147,19,199,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.25518,,38.79

Any ideas on how to transform a timestamp in a CSV file? What is the time attribute outside the field map in the metric object? Can I use that?

Yeah, you have

csv_timestamp_column = "Utc"
csv_timestamp_format = "Unix"

telling the CSV parser to parse the Utc column as a Unix timestamp. What you should do instead is keep the Utc column as a plain field and do the conversion and the time setting in Starlark.
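
Something along these lines should work (untested sketch): drop the two csv_timestamp_* options so Utc comes through as an ordinary field, then compute the epoch seconds in the processor and assign them to metric.time, which holds the metric timestamp in nanoseconds since the Unix epoch:

[[processors.starlark]]
source = '''
def apply(metric):
  # Utc arrives as a plain field; float() handles both a string and an
  # auto-detected float value
  utc = float(metric.fields.pop("Utc"))
  days = int(utc)
  # -2209161600 is the Unix time of the serial-date epoch, 1899-12-30 00:00 UTC
  secs = -2209161600 + days * 86400 + int((utc - days) * 86400)
  # metric.time is nanoseconds since the Unix epoch
  metric.time = secs * 1000000000
  return metric
'''

Here pop() removes the raw serial value so it is not written to InfluxDB as a field; read it with metric.fields["Utc"] instead if you want to keep it.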