Exporting InfluxDB data as a CSV file with the Python client library

Hi there,

I am using InfluxDB 2.0 and would like to export data as CSV files using the Python client library. Any ideas on how I can do that in Python?

Thanks
Ragini

Hello @raginigupta6,
please check out the influxdb-client-python library:

Specifically the .query_csv() method.
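Here's a minimal sketch of what that can look like (the url, token, org, and bucket values below are just placeholders for a local 2.0 instance, so substitute your own):

from influxdb_client import InfluxDBClient

# placeholder connection details
client = InfluxDBClient(url="http://localhost:8086", token="my-token", org="my-org")
query_api = client.query_api()

# query_csv() returns the Flux result as CSV rows; each row is a list of strings
csv_result = query_api.query_csv('from(bucket: "my-bucket") |> range(start: -10m)')
for row in csv_result:
    print(row)

client.close()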
Let me know if you run into any problems, and if it works for you, can you share your code example for future community users?
Thanks!

Thank you. I tried the following, but it's not supported by the Python client library:

client = influxdb.InfluxDBClient( .. )
points = client.query(query, chunked=True, chunk_size=10000).get_points()
dfs = pd.DataFrame(points)
...
for d in dfs:
    d.to_csv(filename, sep="\t")
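(For reference, the pandas route shown above belongs to the 1.x influxdb package; the 2.0 influxdb_client library exposes a similar path through query_data_frame() on the query API. A rough sketch, with placeholder connection values:)

import pandas as pd
from influxdb_client import InfluxDBClient

client = InfluxDBClient(url="http://localhost:8086", token="my-token", org="my-org")
query_api = client.query_api()

# query_data_frame() returns a DataFrame, or a list of DataFrames when the
# query produces more than one result table
frames = query_api.query_data_frame('from(bucket: "my-bucket") |> range(start: -10m)')
if isinstance(frames, list):
    frames = pd.concat(frames, ignore_index=True)

frames.to_csv("output.csv", sep="\t", index=False)
client.close()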

Hi there,

I tried the query_csv() method (as mentioned in GitHub - influxdata/influxdb-client-python: InfluxDB 2.0 python client), but the client library does not seem to support it either. I'm getting this error:

AttributeError: 'function' object has no attribute 'query_csv'

Hello @raginigupta6,
Silly question, but can you please share the code you tried? I just want to double check for a small typo anywhere.
Thank you.

Sure,

here it is:

from influxdb_client import InfluxDBClient, Point, WritePrecision
from datetime import datetime
token = "Jxxxxxxxxxx"
org = "xxxx"
bucket = "txxx"
client = InfluxDBClient(url="http://xx.xx.xx.xx:8086", token=token)
query = ''' from(bucket: "timeseriesDB") |> range(start: -10m) |> filter(fn:(r) => r._measurement == "go_info") '''

csv_result = client.query_api.query_csv('from(bucket:"timeseriesDB") |> range(start: -1d)')
val_count = 0
for row in csv_result:
    for cell in row:
        val_count += 1
        print("value returned",cell)
Error:
AttributeError: 'function' object has no attribute 'query_csv'
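(Side note on that traceback: "'function' object has no attribute 'query_csv'" is what Python raises when query_api is referenced without being called, since query_api() is a method on InfluxDBClient. The relevant lines would look roughly like this:)

query_api = client.query_api()  # note the parentheses: query_api is a method, not an attribute
csv_result = query_api.query_csv('from(bucket:"timeseriesDB") |> range(start: -1d)', org=org)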

Nevermind, the query_csv function works now.

Turns out that I had to correctly install the influxdb-client[ciso] package with Visual C++ build tools instead of just influxdb-client.

Also, is there a limit on how much data it can fetch at a time?

Thanks again for your prompt response 🙂

Hello @raginigupta6,
I'm happy it's working.
@bednar might know about query limits.
In the meantime, here's a blog based on @bednar's work that you might find interesting:

Thanks!!

@raginigupta6, query_csv should perform very well because the response is streamed directly into a csv.reader.

Thanks guys, I was looking for something like this.

I ended up also writing the result to a CSV file:

import csv

# get query result as csv rows
csv_result = query_api.query_csv(query=myQuery, org=myOrg)

# write the rows to a csv file
csv_file = open(r'C:\temp\python\output.csv', "w", newline='')
writer = csv.writer(csv_file)
for row in csv_result:
    writer.writerow(row)
csv_file.close()