Unable to fully load CSV with millions of rows

Hello, I have a CSV file containing 10 million rows that I need to load into InfluxDB running inside a Docker container. The CSV is here: x.csv - Google Drive. The problem is that only around 16k rows seem to be inserted. The command I used is:

```
influx write -b demo -f "x.csv" \
  --header "#constant measurement,observations" \
  --header "#constant tag,granularity,5" \
  --header "#constant tag,region,Bruxelles" \
  --header "#datatype dateTime:number,double,long,double" \
  --header "time,road_id,vehicle_count,average_speed"
```

Could you please help?

Hello @fabio,
If your file has that many rows, I recommend using:
Telegraf file input plugin
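As a sketch, a Telegraf configuration for this CSV could look something like the following. It assumes your file has no header row (since your `influx write` command supplied the column names via `--header` flags), that the `time` column is a Unix epoch timestamp, and that you are writing to InfluxDB 2.x — the URL, organization, and token are placeholders you would replace:

```toml
# telegraf.conf — sketch only; adjust paths and credentials for your setup.
[[inputs.file]]
  files = ["x.csv"]
  data_format = "csv"
  # The CSV has no header row in this sketch; column names are supplied here.
  csv_header_row_count = 0
  csv_column_names = ["time", "road_id", "vehicle_count", "average_speed"]
  csv_column_types = ["int", "float", "int", "float"]
  csv_timestamp_column = "time"
  csv_timestamp_format = "unix"
  # Use "observations" as the measurement name, matching your write command.
  name_override = "observations"
  # Constant tags, matching the "#constant tag" annotations you used.
  [inputs.file.tags]
    granularity = "5"
    region = "Bruxelles"

[[outputs.influxdb_v2]]
  urls = ["http://localhost:8086"]
  token = "$INFLUX_TOKEN"
  organization = "my-org"
  bucket = "demo"
```

Since this is a one-shot load rather than a continuously growing file, you would run it once with `telegraf --config telegraf.conf --once` rather than leaving the agent running.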

Or the Python client library:
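Here is a minimal sketch using the `influxdb-client` Python package, which batches writes so a 10-million-row file does not have to fit in memory at once. The tag values, column types, and the assumption that `time` is a Unix epoch number are taken from your `influx write` command; the URL, token, and org are placeholders:

```python
import csv

def row_to_line(row):
    """Convert one CSV row (as a dict) to an InfluxDB line-protocol string.

    Tags and field types mirror the annotations from the original
    `influx write` command: road_id/average_speed as floats,
    vehicle_count as an integer, time as an epoch timestamp.
    """
    tags = "granularity=5,region=Bruxelles"
    fields = (
        f"road_id={float(row['road_id'])},"
        f"vehicle_count={int(row['vehicle_count'])}i,"
        f"average_speed={float(row['average_speed'])}"
    )
    return f"observations,{tags} {fields} {row['time']}"

def load_csv(path, url, token, org, bucket="demo"):
    """Stream the CSV into InfluxDB in batches (placeholder credentials)."""
    from influxdb_client import InfluxDBClient
    from influxdb_client.client.write_api import WriteOptions

    with InfluxDBClient(url=url, token=token, org=org) as client:
        # Batching options are illustrative; tune batch_size for your data.
        with client.write_api(
            write_options=WriteOptions(batch_size=5000, flush_interval=10_000)
        ) as write_api:
            with open(path, newline="") as f:
                # Column names match the header annotation from your command.
                reader = csv.DictReader(
                    f, fieldnames=["time", "road_id", "vehicle_count", "average_speed"]
                )
                for row in reader:
                    write_api.write(bucket=bucket, record=row_to_line(row))

if __name__ == "__main__":
    load_csv("x.csv", url="http://localhost:8086", token="my-token", org="my-org")
```

Closing the write API (here via the `with` block) flushes any remaining batched points, which matters for large one-shot loads.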