How to export a large number of measurements from InfluxDB to CSV?

We are trying to export a large number of rows (160,000,000+) from InfluxDB to a CSV file. So far we are just blowing up the memory on the machine that runs the query. We have tried the influx CLI and also a small Python program, but I am lost on how to export this many rows without exhausting the memory of the machine that runs the export. Any thoughts on this?

I just read this thread and thought you might try splitting the export into batches using a time range and appending the output to the CSV, for example:

influx -username myuser -password -database telegraf -precision rfc3339 -execute "SELECT * FROM mymeasurement WHERE time > '2017-12-23T05:25:21Z' and time < '2017-12-24T05:25:00Z'" >> my.csv
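The same batching idea can be scripted so only one time window is held in memory at a time. A minimal sketch, assuming InfluxDB 1.x, a local server, and placeholder credentials, measurement name, and date range (adjust all of these for your setup):

```python
#!/usr/bin/env python3
"""Export a measurement in day-sized batches, appending each batch to one CSV."""
import csv
from datetime import datetime, timedelta

import requests

INFLUX_URL = "http://localhost:8086/query"   # InfluxDB 1.x query endpoint (assumed)
AUTH = ("myuser", "mypassword")              # placeholder credentials
DB = "telegraf"
MEASUREMENT = "mymeasurement"

start = datetime(2017, 12, 23)               # assumed export window
end = datetime(2017, 12, 31)
window = timedelta(days=1)

with open("my.csv", "w", newline="") as f:
    writer = csv.writer(f)
    header_written = False
    t = start
    while t < end:
        # Query one time window at a time so the result set stays small.
        q = (f"SELECT * FROM {MEASUREMENT} "
             f"WHERE time >= '{t.isoformat()}Z' AND time < '{(t + window).isoformat()}Z'")
        resp = requests.get(INFLUX_URL, params={"db": DB, "q": q, "epoch": "ms"}, auth=AUTH)
        resp.raise_for_status()
        for result in resp.json().get("results", []):
            for series in result.get("series", []):
                if not header_written:
                    writer.writerow(series["columns"])
                    header_written = True
                writer.writerows(series["values"])
        t += window
```

Shrink the window (hours or minutes instead of days) if a single day is still too large for your data volume.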

There is also a setting in InfluxDB which will automatically split the result data into chunks of 1,000 points.
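If that refers to chunked query responses on the 1.x HTTP API (`chunked=true`, with an optional `chunk_size`), the response can be consumed as a stream so the full result set never sits in memory. A hedged sketch, reusing the same placeholder host, credentials, and query as above:

```python
import csv
import json

import requests

# Stream a chunked query response (InfluxDB 1.x HTTP API) straight to CSV.
params = {
    "db": "telegraf",
    "q": "SELECT * FROM mymeasurement",
    "chunked": "true",
    "chunk_size": "10000",   # rows per chunk returned by the server
    "epoch": "ms",
}
with requests.get("http://localhost:8086/query", params=params,
                  auth=("myuser", "mypassword"), stream=True) as resp, \
        open("my.csv", "w", newline="") as f:
    resp.raise_for_status()
    writer = csv.writer(f)
    header_written = False
    # Each line of the streamed body is one JSON-encoded chunk of results.
    for line in resp.iter_lines():
        if not line:
            continue
        chunk = json.loads(line)
        for result in chunk.get("results", []):
            for series in result.get("series", []):
                if not header_written:
                    writer.writerow(series["columns"])
                    header_written = True
                writer.writerows(series["values"])
```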

Does this even work yet? After 2 hours of googling and querying, no error messages are received, journalctl shows the query hit the database and returned 200, yet no CSV file appears.

I posted this response to another question here about the same thing, so I'm not trying to spam, but I ended up just writing a script to chunk the queries and write each chunk into a different CSV file automatically; otherwise InfluxDB would use up all the RAM and implode. See GitHub - matthewdowney/influxdb-stream: pull data out of InfluxDB in chunks and write to CSV files, or write them back to Influx databases or measurements.