Batch load into a remote InfluxDB using Python

Hi everyone.

I’m writing a Python script to collect a lot of data from an Oracle database and load it into InfluxDB on a remote server, to build dashboards in Grafana.
My code works nicely if the script loads each collected metric into InfluxDB individually. However, with this approach, 10 collected metrics mean 10 database calls.
I would like to avoid this and load all 10 metrics in a single database call, similar to what “influx -import” does, but from Python and against a remote database.

The code I’m using is:
v_payload = [
    {
        "measurement": "SRV_HEATH",
        "tags": {
            # tag key/value pairs (elided in the original post)
        },
        "time": v_curtime,
        "fields": {
            "PCT_USED": v_mempct,
        },
    },
]
I already tried storing all the metrics in a list and passing the list to client.write_points, among other attempts.

Does anyone have an example of how to achieve what I’m trying to do?

Many thanks.
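As a sketch of the single-call approach described above: with the 1.x influxdb-python client, write_points() accepts a list of point dictionaries, so all metrics can go in one call. The host, credentials, tags, and metric values below are placeholders, not taken from the original setup.

```python
def build_payload(measurement, tags, fields_by_name, timestamp):
    """Build one point dict per metric so they can all be written in one call."""
    return [
        {
            "measurement": measurement,
            "tags": tags,
            "time": timestamp,
            "fields": {name: value},
        }
        for name, value in fields_by_name.items()
    ]


def write_payload(payload, database):
    # imported here so the helper above stays usable on its own;
    # influxdb-python (the 1.x client) is assumed to be installed
    from influxdb import InfluxDBClient

    client = InfluxDBClient(host="remote-host", port=8086,
                            username="user", password="secret",
                            database=database)
    client.write_points(payload)  # all metrics, one database call
```

With 10 collected metrics, build_payload returns 10 point dicts and write_points sends them in a single HTTP request instead of ten.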

Hello @RicaRezende,
What version of the client are you using? influxdb-client-python is the only client that’s still maintained.
Perhaps it would be easier to convert your data into a DataFrame and write the whole DataFrame at once?
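A minimal sketch of the DataFrame route with influxdb-client-python, which can write a whole DataFrame in one call via the data_frame_measurement_name parameter. The URL, token, org, bucket, and metric values are placeholders.

```python
from datetime import datetime, timezone

import pandas as pd  # assumed to be available


def metrics_to_frame(fields_by_name, timestamp):
    """One row per timestamp, one column per metric, indexed by time."""
    return pd.DataFrame(fields_by_name,
                        index=pd.DatetimeIndex([timestamp], name="time"))


def write_frame(df, bucket):
    # imported here; influxdb-client-python is assumed to be installed
    from influxdb_client import InfluxDBClient
    from influxdb_client.client.write_api import SYNCHRONOUS

    with InfluxDBClient(url="http://remote-host:8086",
                        token="my-token", org="my-org") as client:
        client.write_api(write_options=SYNCHRONOUS).write(
            bucket=bucket,
            record=df,
            data_frame_measurement_name="SRV_HEATH",  # placeholder name
        )


df = metrics_to_frame({"PCT_USED": 42.5, "PCT_FREE": 57.5},
                      datetime(2024, 1, 1, tzinfo=timezone.utc))
```

Every column of the DataFrame becomes a field, so one write() call covers all metrics for a timestamp.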

Thanks Anais.

Yes, I’m using the client you mentioned. :wink:

I’m not used to pandas. I’ll take a look at the link you mentioned and try this approach.

Many thanks.

Here’s another example of working with pandas and InfluxDB. In that blog post, I take advantage of the pivot() function to easily create a DataFrame when I have multiple tags and fields.
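The pivot() step mentioned above can be sketched like this: long-format rows (one row per time/metric pair) become a wide DataFrame with one column per metric, which is the shape the DataFrame write path expects. The sample values here are made up.

```python
import pandas as pd  # assumed to be available

# long-format rows, one row per (time, metric) pair
long_df = pd.DataFrame({
    "time": ["2024-01-01T00:00:00Z", "2024-01-01T00:00:00Z",
             "2024-01-01T00:01:00Z", "2024-01-01T00:01:00Z"],
    "metric": ["PCT_USED", "PCT_FREE", "PCT_USED", "PCT_FREE"],
    "value": [42.5, 57.5, 43.0, 57.0],
})

# pivot() gives one row per timestamp and one column per metric
wide_df = long_df.pivot(index="time", columns="metric", values="value")
```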

Hi Anais.

Actually, I just realized that I’m using influxdb-python, not influxdb-client-python.

I just installed influxdb-client-python but now I’m facing some difficulties.
What exactly is the organization, and how does “my-token” work?
I’m using InfluxDB 1.8


Hello @RicaRezende,
Please see this example in the client library for 1.8 compatibility.
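For reference, the 1.8 compatibility mode maps the old credentials onto the 2.x client’s parameters: the token becomes “username:password”, the org is ignored by a 1.8 server (conventionally “-”), and the bucket is “database/retention_policy”. A small sketch with placeholder values:

```python
def v1_connection_settings(username, password, database,
                           retention_policy="autogen"):
    """Map InfluxDB 1.8 credentials onto influxdb-client-python parameters."""
    return {
        "token": f"{username}:{password}",
        "org": "-",  # required by the client, ignored by a 1.8 server
        "bucket": f"{database}/{retention_policy}",
    }


def connect(url, settings):
    # imported here; influxdb-client-python is assumed to be installed
    from influxdb_client import InfluxDBClient

    return InfluxDBClient(url=url, token=settings["token"],
                          org=settings["org"])
```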

Thanks for the example.
And sorry for all the questions; I’m a complete newbie in Python and InfluxDB.

I’m almost there. :slight_smile:
The Python code seems to work; however, no data was inserted into InfluxDB.

My code:

import base64
import time

import psutil
from influxdb_client import InfluxDBClient, Point

import env  # local settings module from the original script

username = 'mon360'
# decode the base64 result to str, otherwise the token becomes "user:b'...'"
password = base64.b64decode(env.INFLUXDBUSRPWD).decode()
database = 'xxxxxx'
retention_policy = 'autogen'
bucket = f'{database}/{retention_policy}'

client = InfluxDBClient(url='http://xxxxxxxxxx:8086', token=f'{username}:{password}', org='-')
print('*** Write Points ***')
# note: the default write_api batches in the background and only flushes
# on close(), so a short-lived script should close the client before exiting
write_api = client.write_api()

v_cpupct = psutil.cpu_percent()
v_curtime = time.strftime('%Y-%m-%dT%H:%M:%SZ', time.gmtime())  # RFC3339, UTC

point = Point('SRV_HEATH').tag('CUSTOMER', env.CUSTOMER).tag('HOSTNAME', env.HOSTNAME).tag('RESOURCE', 'CPU').field('PCT_USED', v_cpupct)

write_api.write(bucket=bucket, record=point)
client.close()  # flush pending writes

So, there are three questions here:

  1. What is wrong, given that there is no execution error yet no data is written into InfluxDB?
  2. Where, in the point, is the timestamp defined?
  3. The example looks like a single-point insert. How do I change it to perform a bulk load?

Many thanks.
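For what it’s worth, a sketch of how these three points are commonly handled with influxdb-client-python. On question 1: the default write_api() batches in the background, so a short-lived script can exit before anything is flushed; passing SYNCHRONOUS write options (or calling close()) avoids that. On question 2: Point.time() attaches the timestamp (and note the original strftime format had a stray hyphen before the “T”). On question 3: write() accepts a list of points, giving a bulk load in one call. Host and credentials below are placeholders.

```python
import time


def rfc3339_utc(epoch=None):
    """UTC timestamp in RFC3339 form, e.g. 1970-01-01T00:00:00Z."""
    return time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime(epoch))


def build_points(metrics, timestamp):
    # imported here; influxdb-client-python is assumed to be installed
    from influxdb_client import Point

    return [
        Point("SRV_HEATH")
        .tag("RESOURCE", name)
        .field("PCT_USED", value)
        .time(timestamp)  # .time() sets the point's timestamp
        for name, value in metrics.items()
    ]


def write_bulk(points, bucket):
    from influxdb_client import InfluxDBClient
    from influxdb_client.client.write_api import SYNCHRONOUS

    with InfluxDBClient(url="http://remote-host:8086",
                        token="user:password", org="-") as client:
        # SYNCHRONOUS writes immediately instead of batching in the
        # background, so nothing is dropped when the script exits
        write_api = client.write_api(write_options=SYNCHRONOUS)
        write_api.write(bucket=bucket, record=points)  # list = bulk load
```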

This topic was automatically closed 60 minutes after the last reply. New replies are no longer allowed.