Loss of data when uploading lots of data from a Redpitaya to an InfluxDB database with Python and viewing it with Grafana

Hi!

I’m working on my Final Degree Project in Bioengineering (I’m an Electronics Engineer). I’m developing a solution in which a lot of data has to be transferred from a Redpitaya sensor over Ethernet to an InfluxDB database. The Redpitaya measures the voltage and current between two points, a Python program computes the impedance from them, and it sends the module (magnitude) and the phase tagged with the frequency (it does a frequency sweep and calculates the FFT coefficients: 26 for module and 26 for phase).

I’m viewing the data with Grafana and, as far as I can see, a lot of data is lost. From my calculations there should be 953.67 samples per second, but there are lots of holes: if you zoom into a single second there are intervals with no data at all. It also looks periodic, as if there were roughly 200 ms without data, 200 ms with data, 200 ms without, and so on.
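To give an idea of the write load this implies, here is a rough calculation (assuming, as in the code further down, that every frame produces 26 module points plus 26 phase points, sent with one write call per frequency):

    frames_per_second = 953.67        # frames arriving per second (from my calculation above)
    points_per_frame = 26 + 26        # 26 module points + 26 phase points per frame
    writes_per_frame = 26             # one write call per frequency in the current loop

    print(frames_per_second * points_per_frame)   # ~49,600 points/s towards InfluxDB
    print(frames_per_second * writes_per_frame)   # ~24,800 write requests/s with the current loop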

I’m using a Python script to do the uploading to the database, and I also save the data to a binary .dat file. All the samples seem to be present in the .dat file, but in the database there is data loss.

Here is a screenshot:

It looks like it stops and starts. The sampling frequency of the Redpitaya FPGA is 125 MHz with a decimation of 8: 125 MHz / 8 = 15.625 MSamples/s.
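Just to make the arithmetic behind the 953.67 figure explicit (the 16384-sample frame length below is the value that matches that figure; it is not shown in the code I pasted, so take it as an assumption):

    sample_rate = 125e6 / 8       # 15.625 MSamples/s after decimation by 8
    frame_length = 16384          # samples per frame (assumed; chosen so the numbers match)
    print(sample_rate / frame_length)   # ~953.67 frames/s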

I have tried sending the data both as JSON and as Influx Line Protocol, and the problem is the same.

I also attach the piece of code that uploads the data (if you want the complete script, just ask):

                            frame_CH1 = [(np.conj(frame_sorted[-i-1]) + frame_sorted[i])/2 for i in range(26)]
                            frame_CH2 = [1j*(np.conj(frame_sorted[-i-1]) - frame_sorted[i])/2 for i in range(26)]

                            # Impedance calculation
                            z = np.divide(np.array(frame_CH2), 1e-3*np.array(frame_CH1))

                            # The phase must be normalized to (0, 360) degrees
                            z_mag = np.abs(z)
                            # z_phase = np.arctan2(np.imag(z), np.real(z))
                            z_phase = np.mod(np.arctan2(np.imag(z), np.real(z))*180/np.pi + 360, 360)

                            self.queue.put([z_mag, z_phase])


                            # Format data and send to InfluxDB in Influx Line Protocol
                            for i, freq in enumerate(self.frequencies_val):  # self.frequencies_val is accessible and contains the frequencies
                                # Create the points in Line Protocol format
                                module_line = f"module,frequency={freq} value={float(z_mag[i])}"
                                phase_line = f"phase,frequency={freq} value={float(z_phase[i])}"

                                # Send the points to InfluxDB
                                self.client.write([module_line, phase_line], {'db': 'bioimpedance'}, 204, 'line')
                                print("Data sent to InfluxDB for this frequency.")

        # Reset the byte array
        data_frame = bytearray()

        # The new length must take into account the bytes beyond self.size
        data_length = np.mod(data_length, self.size)
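In case the per-frequency loop is part of the problem (see point 1 in the list further down), this is a sketch of how the same data could be sent in a single write call per frame; it uses the same self.client, z_mag, z_phase and self.frequencies_val as above, and I have not verified that it changes anything:

    # Sketch: build all Line Protocol points of one frame and send them in one request
    lines = []
    for i, freq in enumerate(self.frequencies_val):
        lines.append(f"module,frequency={freq} value={float(z_mag[i])}")
        lines.append(f"phase,frequency={freq} value={float(z_phase[i])}")

    # One write per frame (52 points) instead of 26 writes of 2 points each
    self.client.write(lines, {'db': 'bioimpedance'}, 204, 'line')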

And this is the other test I did, sending JSON instead; the problem is the same, the only difference is how the data is sent:

                            # Format and send data to InfluxDB
                            for i, freq in enumerate(self.frequencies_val):  # self.frequencies_val is accessible and contains the frequencies
                                points = [
                                    {
                                        "measurement": "module",
                                        "tags": {
                                            "frequency": str(freq)
                                        },
                                        "fields": {
                                            "value": float(z_mag[i])
                                        }
                                    },
                                    {
                                        "measurement": "phase",
                                        "tags": {
                                            "frequency": str(freq)
                                        },
                                        "fields": {
                                            "value": float(z_phase[i])
                                        }
                                    }
                                ]
                                print("Generated points for this frequency.")
                                self.client.write_points(points)
                                print("Points written to InfluxDB for this frequency.")

I would be very grateful if someone could help me solve this problem of the non-continuity of the data and these pauses. I think it could be caused by:

  1. The code, for example the for loop pausing the sending, or some other part of the structure that should be done differently.

  2. A limit on the rate or amount of data that can be sent to InfluxDB.

  3. Something related to timestamps: if the samples are too close together in time, they could be merged or overwritten, although the fact that the gaps look periodic (alternating intervals with and without data) makes me doubt this. (See the sketch just after this list.)
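Regarding point 3, this is what I mean: InfluxDB identifies a point by measurement + tag set + timestamp, and points sent without an explicit timestamp get one assigned on the server side, so attaching an explicit nanosecond timestamp to each frame should rule out overwriting. This is only a sketch of the idea (inside the same per-frequency loop as in the Line Protocol code above), not something I have tested:

    import time

    # One explicit nanosecond timestamp per frame, appended after the fields in Line Protocol
    ts_ns = time.time_ns()
    module_line = f"module,frequency={freq} value={float(z_mag[i])} {ts_ns}"
    phase_line = f"phase,frequency={freq} value={float(z_phase[i])} {ts_ns}"
    self.client.write([module_line, phase_line], {'db': 'bioimpedance'}, 204, 'line')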

Thank you very much!

Josep Mencion Seguranyes

PS: I also attach a photo of the Redpitaya in case it helps; it is the board I am using in this project, and it is a very powerful board for applications like this one: Captura de pantalla 2024-02-23 a las 9.53.40.png - Google Drive