Line protocol works for a few entries but fails for a large file

Dear Experts,

I am running InfluxDB 2.1.1 Server: 657e183 Frontend: cc65325 on a PC running Debian 11.
I am using the UI to upload old data I formatted to line protocol (Data / Buckets / Add Data / Line Protocol). The file looks like this:

sensors,device=growatt,sensor=voltage value=277.1 1481382628880000000
sensors,device=growatt,sensor=power value=0 1481382628880000000
sensors,device=growatt,sensor=today value=2700 1481382628880000000
sensors,device=miflora,sensor=temp value=20.3 1481382904741000000
sensors,device=miflora,sensor=moisture value=17 1481382904741000000

If I just add a few lines manually (leaving the precision at nanoseconds), the data is written to the database. I know these are old records from 2016, but the retention policy is set to Forever.

Next, if I upload the entire file (9 MB, 136K lines), I just get a generic error saying it failed. I have 8 of these files, so I already did a lot of trimming to keep each file under 10 MB.

Is there any way to tell which line caused the error? I generated these lines in Excel using simple string concatenation, so I am fairly sure there are no illegal characters or, for example, spaces in the tag values.
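Since the UI does not report the offending line, one option is to pre-check the file yourself. Below is a rough sketch (not the official parser, and the sample data is made up) that flags carriage returns and lines whose space count does not match the basic "measurement,tags fields [timestamp]" shape:

```python
# Rough sanity check for line-protocol text (a sketch, not InfluxDB's parser):
# flag carriage returns and lines with an unexpected number of spaces.

def check_lines(text: str):
    """Return (line_number, reason) for each suspect line."""
    findings = []
    for lineno, line in enumerate(text.split("\n"), start=1):
        if "\r" in line:
            findings.append((lineno, "carriage return (\\r) - Windows line ending"))
            continue
        if not line.strip():
            continue
        # Naive split: real line protocol allows escaped spaces ("\ "),
        # which this sketch deliberately does not handle.
        parts = line.split(" ")
        if len(parts) < 2:
            findings.append((lineno, "no field set found"))
        elif len(parts) > 3:
            findings.append((lineno, "extra spaces (unescaped space in a tag or field?)"))
    return findings

# Hypothetical sample: line 2 carries a Windows line ending.
sample = ("sensors,device=growatt,sensor=voltage value=277.1 1481382628880000000\n"
          "sensors,device=miflora,sensor=temp value=20.3 1481382904741000000\r\n"
          "sensors,device=miflora,sensor=moisture value=17")

for lineno, reason in check_lines(sample):
    print(f"line {lineno}: {reason}")
```

To check a real export, read the file in binary mode (`open(path, "rb").read().decode()`) so Python does not silently normalize the line endings first.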

Thanks a lot,
Csongor

I have given up on the standard line-protocol import and instead created a flow in Node-RED that reads each record from my database, converts it to line protocol, and feeds it into the InfluxDB batch node. I migrated more than 3 million records without any issues.

The upload in the UI is not very good, in my opinion.
Since you mention Excel, I can think of a possible cause of the error:
Influx line protocol must use Unix line feeds (\n) only!
If there are Windows line endings (\r\n) left over from the Excel manipulation, the import fails, unfortunately with an unspecific error message.
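If that is the cause, it is easy to fix before uploading. A minimal sketch (the filenames are placeholders, and `dos2unix` or `tr -d '\r'` on the command line would do the same job):

```python
# Strip \r bytes from a line-protocol file so only Unix \n line feeds remain.
# Binary mode is used on purpose, so Python never rewrites line endings itself.

def strip_crlf(src: str, dst: str) -> int:
    """Copy src to dst with all \r bytes removed; return how many were dropped."""
    with open(src, "rb") as f:
        data = f.read()
    cleaned = data.replace(b"\r", b"")
    with open(dst, "wb") as f:
        f.write(cleaned)
    return len(data) - len(cleaned)

# Example usage (assumed filenames):
# removed = strip_crlf("sensors_windows.lp", "sensors_unix.lp")
# print(f"removed {removed} carriage returns")
```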