Using SeriesHelper with 'batch_write'

So I'm reading pins on a Raspberry Pi and sending the data over the network to InfluxDB. Yes, I am batching the writes, but the batch blocks the main Python program while it sends the data (the POST to the DB), which is why I'm using an async lib for Python 2… The SeriesHelper class does provide this batch writing, but how am I supposed to wrap it in a try/except block? The wifi is very unstable and very occasionally the device loses its internet connection. Any recommendations?
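
To make that first question concrete, this is roughly what I imagine, with the try/except around the MySeriesHelper(...) call itself, since with autocommit = True that call is (as far as I understand) where the POST fires once bulk_size points are buffered. I'm guessing a dropped connection shows up as requests.exceptions.ConnectionError, plus the client's own InfluxDBClientError / InfluxDBServerError, but I haven't verified that:

import requests
from influxdb.exceptions import InfluxDBClientError, InfluxDBServerError

try:
    # With autocommit = True this call can trigger the POST once bulk_size points are buffered.
    MySeriesHelper(server_name=cpu_serial,
                   pin0=chv[0], pin1=chv[1], pin2=chv[2], pin3=chv[3],
                   pin4=chv[4], pin5=chv[5], pin6=chv[6], pin7=chv[7])
except (requests.exceptions.ConnectionError,
        InfluxDBClientError, InfluxDBServerError) as e:
    # Connection dropped or the server rejected the batch; skip this flush and retry later.
    print('InfluxDB write failed: %s' % e)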

Another possible idea was to set the autocommit option to False and keep a separate counter, committing when SeriesHelper has been called for, let's say, the 500th time. Does that make sense? Where exactly do I put the try/except block?
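
This is roughly what I have in mind for that second idea (queue_point and points_queued are just names I made up for the sketch, and I'm assuming MySeriesHelper.commit() is the right call to flush the buffer when autocommit is off):

import requests
from influxdb.exceptions import InfluxDBClientError, InfluxDBServerError

points_queued = 0

def queue_point(**values):
    # Buffer one reading; with autocommit = False this does not touch the network.
    global points_queued
    MySeriesHelper(**values)
    points_queued += 1
    if points_queued >= 500:
        try:
            # The one place where the POST actually happens.
            MySeriesHelper.commit()
            points_queued = 0
        except (requests.exceptions.ConnectionError,
                InfluxDBClientError, InfluxDBServerError) as e:
            # The buffered points are kept, so the next call will simply try again.
            print('commit failed, will retry: %s' % e)

The while loop would then call queue_point with the same keyword arguments instead of constructing MySeriesHelper directly. One thing I'm not sure about with this approach is that the buffer keeps growing while the wifi is down, so it probably needs some cap after which old points get dropped.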
Currently I have this simple code:

import time

from influxdb import SeriesHelper

class MySeriesHelper(SeriesHelper):
    class Meta:
        client = myclient
        # The series name must be a string aka measurement
        series_name = 'events.stats.{server_name}'
        # Defines all the fields in this time series.
        fields = ['pin0', 'pin1', 'pin2', 'pin3', 'pin4', 'pin5', 'pin6', 'pin7']
        # Defines all the tags for the series.
        tags = ['server_name']
        # Defines the number of data points to store prior to writing on the wire.
        bulk_size = 500
        autocommit = True

while True:
    ch = ads1256.read_all_channels()
    for x in range(0, 8):
        chv[x] = (((ch[x] * 100) / 167.0) / int(gain)) / 1000000.0

    MySeriesHelper(server_name=cpu_serial,
                   pin0=chv[0], pin1=chv[1], pin2=chv[2], pin3=chv[3],
                   pin4=chv[4], pin5=chv[5], pin6=chv[6], pin7=chv[7])

    time.sleep(interval)
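
One more thought on the blocking part: instead of the async lib, maybe a plain background thread could do the commit so the sampling loop never waits for the POST. A rough sketch of what I mean (it assumes autocommit is switched to False so only the thread writes; flush_every, samples and commit_in_background are names I made up, and I'm not sure SeriesHelper's internal buffer is thread-safe, so there could be a race between appending and committing):

import threading

import requests
from influxdb.exceptions import InfluxDBClientError, InfluxDBServerError

def commit_in_background():
    # Flush the buffered points without blocking the sampling loop.
    try:
        MySeriesHelper.commit()
    except (requests.exceptions.ConnectionError,
            InfluxDBClientError, InfluxDBServerError) as e:
        print('background commit failed: %s' % e)

flush_every = 500
samples = 0

while True:
    ch = ads1256.read_all_channels()
    for x in range(0, 8):
        chv[x] = (((ch[x] * 100) / 167.0) / int(gain)) / 1000000.0

    MySeriesHelper(server_name=cpu_serial,
                   pin0=chv[0], pin1=chv[1], pin2=chv[2], pin3=chv[3],
                   pin4=chv[4], pin5=chv[5], pin6=chv[6], pin7=chv[7])
    samples += 1

    if samples % flush_every == 0:
        threading.Thread(target=commit_in_background).start()

    time.sleep(interval)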
