Export big data from InfluxDB

Did you try adding epoch=s to the URL? For example: query?db=mydb&epoch=s
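
For anyone else landing here, hitting the 1.x HTTP query endpoint with that parameter looks roughly like this (a minimal Python sketch; the host, database, and measurement names are placeholders):

import requests

# epoch=s makes InfluxDB return Unix-second timestamps instead of
# RFC3339 strings (other accepted values: ns, u, ms, m, h).
resp = requests.get(
    "http://127.0.0.1:8086/query",  # placeholder host
    params={"db": "mydb", "epoch": "s", "q": "SELECT * FROM trade LIMIT 5"},
)
print(resp.json())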

(And thank you for your answer, it helps me too!)

Thank you, aspyk.
epoch=#timeunit seems to do it.

I had the same issue exporting large amounts of CSV data and had to chunk the queries into intervals, so I ended up writing a tool that does it all automatically while managing RAM explicitly: GitHub - matthewdowney/influxdb-stream: Pull data out of InfluxDB in chunks and write to CSV files, or write them back to Influx databases or measurements.
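
The basic pattern the tool automates is something like this (a minimal Python sketch of chunked queries, not the tool itself; the host, database, and measurement names are placeholders):

import datetime as dt
import requests

HOST = "http://127.0.0.1:8086"   # placeholder host
DB = "marketdata"                # placeholder database
INTERVAL = dt.timedelta(hours=24)
FMT = "%Y-%m-%dT%H:%M:%SZ"       # RFC3339 timestamps, UTC

start = dt.datetime(2020, 1, 1)
end = dt.datetime(2020, 2, 1)

cursor = start
while cursor < end:
    # Query one bounded time slice at a time, so no single query
    # forces InfluxDB to materialize the whole range in memory.
    chunk_end = min(cursor + INTERVAL, end)
    q = ("SELECT * FROM trade WHERE time >= '%s' AND time < '%s' LIMIT 20000"
         % (cursor.strftime(FMT), chunk_end.strftime(FMT)))
    resp = requests.get(HOST + "/query",
                        params={"db": DB, "epoch": "s", "q": q})
    resp.raise_for_status()
    # ... append the rows from resp.json() to a CSV file here ...
    cursor = chunk_end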

You give it a config like this:

{;; The InfluxDB database to connect to
 :host          "127.0.0.1"
 :port          8086
 :db            "marketdata"

 ;; Fetch all rows for this measurement, between the start and end dates,
 ;; making queries spanning :interval amounts of time. The :interval is
 ;; important because it imposes a bound on InfluxDB memory usage for a
 ;; single query. The $timeFilter is replaced with a time range expression
 ;; according to where in the time range the cursor is, and a LIMIT is
 ;; appended to the query.
 :query         "SELECT * FROM trade WHERE $timeFilter"
 :query-limit   20000 ; max rows returned per query
 :start         #inst"2020-01-01"
 :end           #inst"2020-02-01"
 :interval      [24 :hours]

 ;; Write a certain number of rows per file to a series of files named with
 ;; the given pattern, which accepts the timestamp of the first row.
 :date-format   "YYYY-MM-dd"
 :file          "trade.%s.csv"
 :rows-per-file 10000}
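
With these settings, each query covers a 24-hour slice of the range capped at 20,000 rows, and the output rolls over to a new trade.<date>.csv file every 10,000 rows, so neither the server nor the exporter has to hold the whole month in memory at once.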