Memory consumption on bulk write

Hello,

I want to start using InfluxDB to replace a Postgres database for some of my data.
I wrote a Node.js script that extracts the old data from Postgres and injects around 20M lines, 500k at a time.
Each entry contains 2 fields and 3 tags.
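For reference, a stripped-down version of the write loop looks like this (measurement, tag, and field names are placeholders; the real script streams rows out of Postgres):

```js
const Influx = require('influx');

const influx = new Influx.InfluxDB({
  host: 'localhost',
  database: 'mydb', // placeholder database name
});

// `rows` comes from a Postgres query; the row shape here is illustrative.
async function ingest(rows) {
  const BATCH_SIZE = 500000;
  for (let i = 0; i < rows.length; i += BATCH_SIZE) {
    const points = rows.slice(i, i + BATCH_SIZE).map((row) => ({
      measurement: 'metrics',                                   // placeholder
      tags: { tagA: row.tagA, tagB: row.tagB, tagC: row.tagC }, // 3 tags
      fields: { value1: row.value1, value2: row.value2 },       // 2 fields
      timestamp: row.time,
    }));
    await influx.writePoints(points); // one 500k-point write per request
  }
}
```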

At first, I tried on a dedicated server with 4 GB of memory, but my Influx service got OOM-killed. I then upgraded the server to 8 GB, and the script now succeeds most of the time. However, memory consumption climbs to nearly 8 GB.
After the injection finishes, memory drops back below 1 GB within 2 minutes.
Even though my script eventually finishes, the memory consumption seems uncontrolled.
What processes are at work that could use so much memory?
Is there a way to limit the memory usage?

I also tried removing one of the tags, and the memory usage was much lower.

Do you have any advice on how to cap the memory usage?
Is InfluxDB usable in production with this kind of uncontrolled memory usage?

Thanks.

Hi @Nicolas_LASSIMONNE,
I have attached a rough table of system requirements. Please note that the lingering memory usage is probably the ingested data being compressed and reindexed for storage. You are hitting an 8 GB machine pretty hard.
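
One thing that usually helps on the write path is shrinking the batch size. The InfluxDB docs suggest batches on the order of 5,000 points; a single 500k-point write forces the server to buffer and parse the whole request at once. An untested sketch using the node-influx client (names mirror the placeholders above):

```js
// Split each 500k chunk into ~5k-point batches and write them
// sequentially, so only one small batch is in flight at a time.
const BATCH_SIZE = 5000;

async function ingestInSmallBatches(influx, points) {
  for (let i = 0; i < points.length; i += BATCH_SIZE) {
    await influx.writePoints(points.slice(i, i + BATCH_SIZE));
  }
}
```

Awaiting each write before sending the next gives you backpressure for free: the script never queues more than one batch against the server.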
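
On the server side, the big consumers during a bulk load are the TSM write cache and the series index. Every unique tag combination creates a series, and with the default inmem index the whole index lives in RAM, which is likely why dropping a tag lowered your memory. On InfluxDB 1.x you can cap the cache and switch to the disk-backed TSI index in influxdb.conf (the values below are illustrative, not tuned recommendations):

```toml
[data]
  # Use the disk-backed TSI index instead of the default in-memory index,
  # so index memory no longer grows with series cardinality.
  index-version = "tsi1"

  # Cap the TSM write cache; writes are rejected once the limit is hit.
  cache-max-memory-size = "1g"

  # Snapshot the cache to disk sooner, keeping its resident size down.
  cache-snapshot-memory-size = "25m"
```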