Error while writing large line protocol file

I’m getting a parsing error when I try to write a 1.3 GB text file with data points in line protocol format.
If I pick a few hundred lines of the file and upload them as a separate file, the write completes without any error.
The data file is written by a test Java program that generates uniform data, so a formatting mistake in the file seems unlikely.
Error:
Command: influx write -b TestBucket -o INFA -p ms @/data/cloudqa/influx/data30.txt
Error: Failed to write data: http/handleWrite: unable to parse '82': missing fields
unable to parse 'process_count,tenant_id=Tenant2,process_name=Proc30,status=Faulted pcount=130 1559928204082process_count,tenant_id=Tenant1,process_name=Proc87,status=Faulted pcount=187 1559928204082': bad timestamp
unable to parse '1559928204082': missing fields
unable to parse 'process_count,tenant_id=Tenant6,process_name=Proc64,status=Suspended pcount=164process_count,tenant_id=Tenant6,process_name=Proc22,status=Suspended pcount=122 1559928204082': invalid number
unable to parse 'process_count,tenant_process_count,tenant_id=Tenant8,process_name=Proc82,status=Faulted pcount=182 1559928204082': missing tag value
unable to parse 'pcount=169 1559928204082': invalid field format
unable to parse 'process_count,tenant_id=Tenant9,process_name=Proc70,status=Completeprocess_count,tenant_id=Tenant9,process_name=Proc27,status=Faulted pcount=127 1559928204082': duplicate tags

Hi @Yogesh, welcome!

How many lines does the file contain? Can you extract the first few lines of the file that give an error and post the content?

Best regards,

Hi Marc,

Thanks for responding.

When the file upload did not work, I looked for other options and found the Java client. It’s a bit slow but seems to work, so I can probably live with the file-upload issue for now.

To answer your question, the issue cropped up at about line 130. Here are a few lines from the file:
process_count,tenant_id=Tenant1,process_name=Proc1,status=Faulted pcount=101 1559928504688
process_count,tenant_id=Tenant1,process_name=Proc2,status=Suspended pcount=102 1559928504688
process_count,tenant_id=Tenant1,process_name=Proc3,status=Suspended pcount=103 1559928504688
process_count,tenant_id=Tenant1,process_name=Proc4,status=Faulted pcount=104 1559928504688
process_count,tenant_id=Tenant1,process_name=Proc5,status=Completed pcount=105 1559928504688
process_count,tenant_id=Tenant1,process_name=Proc6,status=Suspended pcount=106 1559928504688
process_count,tenant_id=Tenant1,process_name=Proc7,status=Faulted pcount=107 1559928504688
process_count,tenant_id=Tenant1,process_name=Proc8,status=Suspended pcount=108 1559928504688
process_count,tenant_id=Tenant1,process_name=Proc9,status=Completed pcount=109 1559928504688
process_count,tenant_id=Tenant1,process_name=Proc10,status=Completed pcount=110 1559928504688
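For what it’s worth, the fused points in the error output (e.g. a timestamp running straight into the next measurement name) look exactly like what happens when a generator forgets the newline between points. A minimal sketch, with hypothetical class and file names (the actual test program isn’t shown here), of writing points with an explicit `'\n'`:

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class LpWriter {
    // Build one line protocol point in the same shape as the sample data above.
    static String formatPoint(int i) {
        return String.format(
            "process_count,tenant_id=Tenant1,process_name=Proc%d,status=Faulted pcount=%d 1559928504688",
            i, 100 + i);
    }

    public static void main(String[] args) throws IOException {
        try (BufferedWriter w = Files.newBufferedWriter(Paths.get("data30.txt"))) {
            for (int i = 1; i <= 3; i++) {
                w.write(formatPoint(i));
                // Omitting this write is what fuses two points onto one line,
                // producing "bad timestamp" / "duplicate tags" parse errors.
                w.write('\n');
            }
        }
    }
}
```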

@Yogesh, maybe there are stray newline characters in the file?
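One quick way to test that hypothesis is to scan the file and flag any line that doesn’t match the expected shape (one measurement+tag set, one field set, one 13-digit millisecond timestamp). A minimal sketch, assuming the path from the `influx write` command and the uniform format shown above — note the pattern is simplified and would not handle escaped spaces, which real line protocol allows:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.regex.Pattern;

public class LpChecker {
    // Expected shape of one point: "measurement,tags fields timestamp"
    // (no unescaped spaces inside the tag or field set, 13-digit timestamp).
    private static final Pattern POINT = Pattern.compile("^\\S+ \\S+ \\d{13}$");

    static boolean isWellFormed(String line) {
        return POINT.matcher(line).matches();
    }

    public static void main(String[] args) throws IOException {
        int lineNo = 0;
        for (String line : Files.readAllLines(Paths.get("/data/cloudqa/influx/data30.txt"))) {
            lineNo++;
            // A fused pair of points, or a point split by a stray newline,
            // both fail this shape check and get reported with their line number.
            if (!isWellFormed(line)) {
                System.out.println("line " + lineNo + " malformed: " + line);
            }
        }
    }
}
```

Running this against the full 1.3 GB file should point straight at the first offending line instead of relying on the parser’s error excerpts.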

I get the same kind of errors when I try …

> insert 151151515125151
ERR: {"error":"unable to parse '151151515125151': missing fields"}

> insert pcount=169 1559928204082
ERR: {"error":"unable to parse 'pcount=169 1559928204082': invalid field format"}

> insert process_count,tenant_id=Tenant9,process_name=Proc70,status=Completeprocess_count,tenant_id=Tenant9,process_name=Proc27,status=Faulted pcount=127 1559928204082
ERR: {"error":"unable to parse 'process_count,tenant_id=Tenant9,process_name=Proc70,status=Completeprocess_count,tenant_id=Tenant9,process_name=Proc27,status=Faulted pcount=127 1559928204082': duplicate tags"}