Partial write: field type conflict ...is type float, already exists as type integer dropped

Hi everyone,

=== information ===
Telegraf version 1.25.3-1 / InfluxDB version 1.8.10-1

telegraf.conf:
[[outputs.influxdb]]
urls = ["http://127.0.0.1:8086"]
database = "dbs"
skip_database_creation = true
timeout = "15s"

[[inputs.file]]
files = ["/meteo/xxx.csv"]
data_format = "csv"
csv_header_row_count = 0
csv_column_names = ["time","stname","NumR","Vbat","PTemp","T10m","RH10m","T2m","RH2m","WS2mSWVT","WDir2mD1WVT","WS10mSWVT","WDir10mD1WVT","SlrW","SlrkJTot","SR01Up","SR01Dn","IR01Up","IR01Dn","NR01TC","NR01TK","NetRs","NetRl","Albedo","UpTot","DnTot","NetTot","IR01UpCo","IR01DnCo","BPmbar","PPmm6mTot","PPmm0mTot","DT","Q","TCDT","DBTCDT","Pr1m","Wavg","PrTotal"]
csv_column_types = ["string","string","integer","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float","float"]
csv_skip_rows = 0
csv_skip_columns = 0
csv_delimiter = ","
csv_comment = ""
csv_tag_columns = ["stname"]
csv_timestamp_column = "time"

When starting Telegraf, I'm getting many errors like:

E! [outputs.influxdb] Failed to write metric (will be dropped: 400 Bad Request): partial write: field type conflict: input field "PPmm0mTot" on measurement "file" is type float, already exists as type integer dropped=3221

I completely rewrote the input CSV file to match csv_column_types, but Telegraf still fails to write anything to InfluxDB. I didn't have csv_column_types defined before, but I noticed that some values in the CSV file were written sometimes as integers and other times as floats; that's why I added csv_column_types and rewrote the file. Even so, Telegraf still doesn't read the CSV file correctly.
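For context on the error itself, here is a minimal Python sketch (the helper function is hypothetical, not Telegraf code) of why InfluxDB raises this conflict: a field's type is fixed the first time it is written, and the line protocol distinguishes integers (an `i` suffix) from floats, so a later float write to an integer field is rejected.

```python
def influx_field_literal(value):
    """Render a field value roughly as InfluxDB line protocol would."""
    if isinstance(value, int):
        return f"{value}i"   # integers get an 'i' suffix in line protocol
    return repr(float(value))

print(influx_field_literal(0))      # written once as integer: "0i"
print(influx_field_literal(0.000))  # later written as float: "0.0" -> type conflict
```

Once a shard holds "PPmm0mTot" as an integer, any point sending it as a float gets the 400 "field type conflict" response, and the point is dropped.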

csv file:

2022-06-04 15:24:00,TresPuntas,128212,11.960,-4.688,-79.420,0.356,-7.887,90.900,1.971,175.400,2.388,178.100,162.300,9.735,165.700,131.400,-7.614,-0.390,-6.691,266.500,34.320,-7.225,0.793,158.100,131.000,27.090,278.200,285.400,648.668,0.000,0.000,4.881,226.800,4.111,1.065,0.030,7999.000,49.610

2022-06-04 15:25:00,TresPuntas,128213,11.970,-4.693,-79.470,0.426,-7.904,90.900,2.334,199.600,3.053,193.000,163.800,9.829,167.200,132.700,-7.961,-0.465,-6.656,266.500,34.560,-7.495,0.793,159.300,132.200,27.070,278.000,285.500,648.661,0.000,0.000,3.974,184.200,3.346,1.830,0.029,7999.000,49.640

2022-06-04 15:26:00,TresPuntas,128214,11.960,-4.698,-79.440,0.356,-7.935,90.800,3.141,165.300,3.126,168.500,171.300,10.278,174.800,138.700,-8.230,-0.781,-6.650,266.500,36.020,-7.446,0.794,166.500,138.000,28.570,277.800,285.200,648.542,0.000,0.000,3.446,166.200,2.902,2.274,0.024,7999.000,49.660

any ideas or suggestions?
I am totally lost with this issue …

thanks in advance …

dos2unix and "int" instead of "integer" in csv_column_types solved the problem…
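In case it helps others hitting this: a small Python sketch of the assumed cause (the sample value is taken from the rows above). A file saved with Windows line endings leaves a trailing carriage return on the last column of every row. Python's float() happens to tolerate the stray character, but a strict parser such as Go's strconv (which Telegraf uses) rejects it, which is why running dos2unix on the file fixed the parse.

```python
# A CSV row saved with Windows (CRLF) line endings:
line = "2022-06-04 15:24:00,TresPuntas,128212,49.610\r\n"

# Splitting after stripping only '\n' leaves '\r' glued to the last field:
last_field = line.rstrip("\n").split(",")[-1]
print(repr(last_field))  # -> '49.610\r'

# Python's float() strips the whitespace-like '\r', so the problem is easy
# to miss when testing by hand; a stricter parser fails on the same input.
print(float(last_field))  # -> 49.61
```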

sorry !


Hello @rumekintun,
Thanks for sharing your solution 🙂