Hi, when I use my Python script, which stores a lot of data in InfluxDB, my system crashes. InfluxDB is using 100 % of RAM. I am using Docker Compose on a Google Cloud server (2 CPU / 8 GB RAM).
Does anyone have the same problem, or a solution?
Besides this, I tried to limit my database Docker container to no more than 6 GB. That keeps the system from freezing, but the Docker container restarts again and again because it reaches the RAM limit… I also tried this config YAML, but with no success…
data:
  cache-max-memory-size: 256m
  max-concurrent-compactions: 2
  wal-fsync-delay: 100ms
  memory-size: 1g
  query-memory-bytes: 256m
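For the 6 GB cap mentioned above, a memory limit can also be set on the container itself in the Compose file. This is a minimal sketch, not your actual setup: the service name and image tag are assumptions, and `mem_limit` is a hard cap, so the container gets OOM-killed and restarted once it is exceeded (which matches the restart loop you describe):

```yaml
services:
  influxdb:                  # assumed service name
    image: influxdb:1.8      # illustrative tag
    mem_limit: 6g            # hard cap; container is killed/restarted when exceeded
    restart: unless-stopped
```

A hard cap alone does not reduce InfluxDB's memory demand; it only changes how the system fails when the demand is too high.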
Not sure how you define ’a lot’ of data.
I don't know the internal workings of InfluxDB exactly, but from what I have read, and looking at other people's problems with high RAM usage and crashes, I believe it needs some time to actually process the data.
While it will gladly accept data at high rates, it buffers the data in memory and writes it to storage, processes it, and indexes it asynchronously.
If you feed it too fast, the processing cannot keep up and the buffer grows… until memory is exhausted.
I would suggest feeding it more slowly, providing more CPU power, or adding a lot more RAM. Or a combination of all three.
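"Feeding more slowly" can be done on the client side by writing in fixed-size batches with a pause between them, so the server's in-memory buffer can drain. This is a minimal sketch, assuming the influxdb-python 1.x client (whose `write_points` call this uses); the batch size and pause values are illustrative, not tuned:

```python
import time

def chunked(points, size):
    """Yield successive fixed-size batches from a list of points."""
    for i in range(0, len(points), size):
        yield points[i:i + size]

def write_throttled(client, points, batch_size=5000, pause_s=1.0):
    """Write points in batches with a pause between batches, giving
    InfluxDB time to flush its cache/WAL instead of buffering everything."""
    for batch in chunked(points, batch_size):
        client.write_points(batch)  # influxdb-python 1.x client call
        time.sleep(pause_s)         # let compaction/indexing catch up
```

Start with a conservative batch size and pause, watch the container's RAM, and loosen the throttle only while memory stays flat.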
@pvflow Did you ever get this resolved?
Yes, I decided to use VictoriaMetrics.