Hi, we are using influxdb-client==1.36.1 with write_api(write_options=SYNCHRONOUS) to write to an InfluxDB Cloud hosted bucket. The payload of each write is a dictionary. Occasionally we need to add a field containing a base64-encoded image (around 200K characters). The code worked perfectly well on and before Jan 16, 2024, but since then any datapoint with such a large payload no longer appears in the database. I've checked that write always returns None. I did not get the callback to work. I can also see there is still balance in the account. What could be wrong in this situation?
I did some tests with a new bucket. It seems that if a field has more than 64*1024 characters, the datapoint gets lost. I noticed the "Plain text string. Length limit 64KB." note in the documentation, and it seems this rule started being strictly enforced last week.
data_point = {
    "measurement": "testBigPayload",
    "tags": {
        "container_id": "1",
    },
    "fields": {
        "somedata": 1,
        # one character over the documented 64 KB limit
        "someBigData": "j" * (64 * 1024 + 1),
    },
    "time": datetime.datetime.utcnow().isoformat("T") + "Z",
}
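Since the write itself fails silently, a minimal pre-write guard can flag oversized string fields before they ever reach the API. This is just a sketch; `MAX_FIELD_LEN` and `oversized_fields` are hypothetical names, and the 64 KiB figure is taken from the documented "Plain text string. Length limit 64KB." limit:

```python
# Hypothetical pre-write check: flag string fields whose length exceeds
# the documented 64 KiB limit for InfluxDB string field values.
MAX_FIELD_LEN = 64 * 1024  # 64 KiB, per the docs

def oversized_fields(point: dict) -> list:
    """Return the names of string fields longer than MAX_FIELD_LEN."""
    return [
        name
        for name, value in point.get("fields", {}).items()
        if isinstance(value, str) and len(value) > MAX_FIELD_LEN
    ]
```

Running `oversized_fields(data_point)` on the dictionary above would report `someBigData`, letting you log or reject the point instead of losing it silently.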
write_client = influxdb_client.InfluxDBClient(url=url, token=token, org=org)
write_api = write_client.write_api(write_options=SYNCHRONOUS)
try:
    # SYNCHRONOUS writes return None on success and raise on HTTP errors
    apiWriteResult = write_api.write(bucket=BUCKET, org=org, record=data_point)
except Exception as e:
    print("influx API write error:", e)
Hello @DeqingSun,
I believe you’re right.
However, InfluxDB v3 has much better support for this.
Unfortunately, my test with the v3 API also fails.
point = (
    InfluxDBClient3.Point("testBigPayload")
    .tag("container_id", "1")
    .field("somedata", 1)
    .field("someBigData", "d" * (64 * 1024 + 1))
)
client.write(point)
This code does not throw any error, but the data just won't appear in the database.
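If the limit cannot be avoided, one possible workaround is to split the large base64 string across several fields, each comfortably under 64 KiB, and rejoin them on read. This is only a sketch of an assumed naming convention (`someBigData_0`, `someBigData_1`, ...), not an InfluxDB feature; `split_field`, `join_field`, and `CHUNK_LEN` are hypothetical names:

```python
# Hypothetical workaround: break one oversized string value into
# multiple fields, each below the 64 KiB field-value limit, and
# reassemble the chunks after querying.
CHUNK_LEN = 60 * 1024  # stay below the 64 KiB limit with some margin

def split_field(name: str, value: str) -> dict:
    """Break one long string into multiple sub-limit fields."""
    return {
        f"{name}_{i}": value[i * CHUNK_LEN:(i + 1) * CHUNK_LEN]
        for i in range((len(value) + CHUNK_LEN - 1) // CHUNK_LEN)
    }

def join_field(name: str, fields: dict) -> str:
    """Reassemble the indexed chunks back into one string."""
    i, parts = 0, []
    while f"{name}_{i}" in fields:
        parts.append(fields[f"{name}_{i}"])
        i += 1
    return "".join(parts)
```

For the 64*1024+1 character test string above, this produces two fields that should each pass the limit, at the cost of a schema convention the reader has to maintain on both the write and query sides.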