Unable to write data to server

I am encountering an issue where my data is not being written to my InfluxDB server. I’m guessing that my issue may be a result of not specifying WriteOptions (batch size, flush interval, etc.), but I couldn’t quite get it to work even after modifying the default values.

Code snippet:

print("Conduct connection Health Check prior to parse: {}".format(client.health()))

write_api = client.write_api()
#query_api = client.query_api()

for element in data:  # element is a key in the TOP LEVEL JSON data object

    dataInfluxDB = []  # List for data batch write - should not exceed 5000 points

    # If the JSON field value is a nested JSON object
    if isinstance(data[element], dict):
        checkDict(data[element], element)

        # This loop will go through every expiration date and strike price in the putExpDateMap and then callExpDateMap
        # A batch write of data will occur in this loop to package all the "PUT" data, send and 

        print(dataInfluxDB)
        write_api.write(bucket=bucket, org='-', record=[dataInfluxDB])
        print('Should have written this data')

    # If the JSON field value is a string, float or integer. This will print the immediate key:value data in the first level object
    elif isinstance(data[element], (str, float, int, type, type(None))):
        pass

print("Result of connection close is: {}".format(client.close()))
print(time.process_time()-start)   # Report total time

Terminal Output:

Conduct connection Health Check prior to parse: {'checks': [],
'commit': None,
'message': 'ready for queries and writes',
'name': 'influxdb',
'status': 'pass',
'version': '1.8.4'}
['aapl,contract=PUT strikePrice=30.0,bid=0.0,ask=0.01,last=0.01,mark=0.01 1615582795743', 'aapl,contract=PUT strikePrice=35.0,bid=0.0,ask=0.01,last=0.01,mark=0.0 1615924799542']
Should have written this data
['aapl,contract=CALL strikePrice=30.0,bid=90.9,ask=91.15,last=90.85,mark=91.03 1615582799923', 'aapl,contract=CALL strikePrice=35.0,bid=90.5,ask=90.65,last=88.25,mark=90.58 1615924799823']
Should have written this data
Result of connection close is: None
0.0

I’m doing this for a school-based project and my goal is to perform batch writes, as shown above, in the line protocol format. I referenced the write API examples, but still have had no luck getting the above to work.

In my example I’m only sending two data points in one batch, but this is simply for testing purposes. In the real application, the batch sizes would probably be closer to 1000-2000 points per write, with two writes total: one for CALL and one for PUT.

Any feedback would be highly appreciated.

I still have not been able to resolve the issue, but I’ve been tinkering around and reading. I was able to get data to write when I commented out my timestamp field. The timestamp field is in a proper format (Unix epoch time), so I am not sure what is wrong, but I will continue to work on this.

You can enable the debug option in the Python client:

client = InfluxDBClient(url="http://localhost:8086", token=token, org=org, debug=True)

What makes you conclude that the data does not arrive? Is there an error message?

Good afternoon Franky, thank you for the help with turning on the debugger; I will definitely be using that going forward. For starters, the issue here is purely my knowledge of InfluxDB as I become more familiar with communicating with the database.

My first issue, I figured out, was not specifying a timestamp precision. My data was being sent in ‘ms’ while the system defaults to ‘ns’; thus, the data itself was not being written to the database as expected even though the line protocol format was proper.

Now my data works:

write_api.write(bucket=bucket, org='-', record=dataInfluxDB, write_precision='ms')
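I believe the client also accepts its WritePrecision enum in place of the plain string; something like this should be equivalent:

from influxdb_client import WritePrecision

write_api.write(bucket=bucket, org='-', record=dataInfluxDB, write_precision=WritePrecision.MS)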

As a control I am still doing a simple write of:

['aapl,contract=PUT strikePrice=30.0,bid=0.0,ask=0.01,last=0.01,mark=0.01 1615582795743', 'aapl,contract=PUT strikePrice=35.0,bid=0.0,ask=0.01,last=0.01,mark=0.0 1615924799542']

['aapl,contract=CALL strikePrice=30.0,bid=90.9,ask=91.15,last=90.85,mark=91.03 1615582799923', 'aapl,contract=CALL strikePrice=35.0,bid=90.5,ask=90.65,last=88.25,mark=90.58 1615924799823']

However, I now believe my issue lies in the batch size and flushing. I manually set my batch_size = 2 as shown here:

write_api = client.write_api(WriteOptions(batch_size=2))

Which generated the first two points as expected. However, the next two points in the script never get logged in the terminal output.

I’m trying to read more about this online and figure out the proper way to approach it. I’m essentially trying to write data in near “real time”, with possibly a one-second delay, but the number of elements in my dataInfluxDB list can vary over time.

As such, a fixed batch_size might be detrimental and I may have to go with a dynamic one, i.e. read the list length, use that as the batch size, write the data, and be ready for the next frame of data.
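For example, something like this is what I have in mind (a rough sketch, not tested; `frames` and `build_points` are just placeholders for however my data actually arrives):

from influxdb_client.client.write_api import WriteOptions

# Rough sketch: let the flush interval, rather than an exact batch size,
# decide when queued points are sent, so the length of dataInfluxDB can vary.
write_api = client.write_api(write_options=WriteOptions(batch_size=5000, flush_interval=1000))

for frame in frames:  # placeholder for however new data arrives each second
    dataInfluxDB = build_points(frame)  # placeholder helper returning line-protocol strings
    write_api.write(bucket=bucket, org='-', record=dataInfluxDB, write_precision='ms')

write_api.close()  # flush whatever is still queued before the script ends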

I can only see a small part of your Python script.
Are you using the asynchronous version? Could it be that the Python script terminates before the scheduler can process the last batch?
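If that turns out to be the case, explicitly flushing before the script ends should help; a minimal sketch of what I mean, at the end of your script:

```python
# Flush anything still buffered before the program exits; otherwise the
# background batching scheduler may be stopped with points still queued.
write_api.close()   # flushes and disposes the batching write API
client.close()
```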

Btw, please post your Python/log/config snippets in Markdown format here, it is more readable:

```python
put the python code here
```

Franky,

I’ll make sure to always post future code and logs in Markdown format. Your comment about the write API being set as synchronous fixed my issue. Thank you for the feedback; the data stream is working flawlessly now!
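For completeness, the working setup now looks roughly like this (simplified sketch of what I ended up with):

```python
from influxdb_client import InfluxDBClient
from influxdb_client.client.write_api import SYNCHRONOUS

client = InfluxDBClient(url="http://localhost:8086", token=token, org=org)

# Synchronous write API: each write() blocks until the points are sent,
# so nothing is left sitting in a background batch when the script exits.
write_api = client.write_api(write_options=SYNCHRONOUS)
write_api.write(bucket=bucket, org='-', record=dataInfluxDB, write_precision='ms')

client.close()
```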
