Difficulties ingesting large amounts of data from annotated CSV files

I’m bulk importing some existing data from Postgres into InfluxDB v2.

Originally I’d query the Postgres table daily for some time series values and generate annotated CSV files that I’d subsequently ingest into Influx from the command line using influx write.
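For context, the export and write step look roughly like the sketch below; the bucket, org, file names and measurement schema are placeholders, not my real setup:

```bash
# Hypothetical daily export written as annotated CSV, then pushed with influx write.
# Bucket/org names and the columns here are illustrative only.
cat > readings_2020-12-01.csv <<'EOF'
#datatype measurement,tag,tag,double,dateTime:RFC3339
m,site,sensor,value,time
readings,berlin,s1,23.5,2020-12-01T00:00:00Z
readings,berlin,s2,21.9,2020-12-01T00:00:00Z
EOF

influx write \
  --org my-org \
  --bucket my-bucket \
  --format csv \
  --file readings_2020-12-01.csv
```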

This works fine for files up to around 100k lines, but when I try to ingest larger files, either individually or in sequence (I’m using a shell script to call influx write repeatedly), I get the following error:

Error: Failed to write data: unexpected error writing points to database: engine: write /Users/account/.influxdbv2/engine/wal/76d17c4d98ca948a/autogen/3/_00001.wal: file too large.

I’ve searched around a bit and can’t find anything useful. I’m using InfluxDB v2.0.3.

I’ve tried playing around with the number of tags I have (usually 2) and their range of values (from over a thousand down to just one), and the result is the same.

I tried ingesting in batches of 5,000 and it’s usually after 20-25 rounds that this happens. I suspect the ingestion itself goes fine, but something I’m unaware of ‘fills up’ and prevents me from ingesting anything more until I wait a prolonged period or restart influxd itself.
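To make that concrete, the batching loop is roughly the sketch below; the chunk size, paths and bucket are placeholders, and the two annotation/header lines have to be re-attached to every chunk:

```bash
#!/bin/bash
# Sketch of the batching loop: split one large annotated CSV into ~5,000-line
# chunks and call influx write once per chunk. All names are placeholders.
SRC=readings_all.csv

# keep the #datatype annotation and the column header separately
head -n 2 "$SRC" > header.csv
tail -n +3 "$SRC" | split -l 5000 - chunk_

for f in chunk_*; do
  cat header.csv "$f" > batch.csv
  influx write --org my-org --bucket my-bucket --format csv --file batch.csv
  sleep 1   # optional pause between batches
done
```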

I also tried converting the data to Line Protocol with influx write dryrun and ingesting that via the Influx UI: same error message.
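The dryrun step was just something along these lines (it prints the converted line protocol to stdout instead of writing anything):

```bash
# Convert the annotated CSV to line protocol without writing it to the database
influx write dryrun --format csv --file batch.csv > batch.lp
```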

Conclusion: it’s not the format I’m ingesting, it’s something internal to InfluxDB.
What am I missing? :slight_smile:
Any ideas?

OK, after some research into this I didn’t find a solution for my own installation (the executable downloaded and installed manually under macOS Big Sur), but with the v2.0.4 Docker image everything works just fine with every method I’d previously tried.
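If anyone wants to try the Docker route, something along these lines is enough; the port and data path are the image defaults as far as I can tell, so adjust the volume to taste:

```bash
# Run InfluxDB 2.0.4 from the official image, keeping the data outside the container
docker run -d --name influxdb \
  -p 8086:8086 \
  -v "$PWD/influxdb2":/var/lib/influxdb2 \
  influxdb:2.0.4
```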

There wasn’t an official v2.0.4 build on the Influx site, so I took the latest nightly; it works fine after just updating the executables (influxd / influx) while keeping the existing database data.

Hope this is useful for anyone having similar difficulties; it certainly was a learning experience for me. The latest build looks like it fixes the issue I was having.