Dynamic calculation of a value based on a provided rate and a daily time range (hours per day)

Hello,

I’m trying to determine whether it’s possible to calculate the cost of the kilowatt-hours consumed during off-peak hours (€0.1672/kWh) and during peak hours (€0.2285/kWh), based on the time of day (for example, peak hours from 07:30 to 23:30 at €0.2285/kWh, and the remaining hours at the off-peak rate). Currently, watt-hours are written to the InfluxDB database without any notion of the hour in which they were consumed, which prevents me from defining two separate calculations in Grafana. I would like to know what the best way to achieve this would be, please.

I had opened a topic on the Grafana community, but it seems that Grafana cannot perform this calculation, to the best of my knowledge: https://community.grafana.com/t/calculation-of-consumption-over-a-period-and-then-converting-it-into-currency/103411

Therefore, I’m wondering whether it would be better to handle this on the InfluxDB side or to make modifications on the Python side. I’m not quite sure how to do this in the simplest way and with the least risk with regard to future server updates.
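
To make the idea more concrete, this is roughly what I imagine on the Python side, although I do not know whether it is the right approach (a rough, untested sketch; the helper names are just placeholders, and 07:30 to 23:30 is the peak window from my example above):

import datetime

# Tariffs and peak window from my question above.
PEAK_RATE = 0.2285      # EUR per kWh
OFFPEAK_RATE = 0.1672   # EUR per kWh
PEAK_START = datetime.time(7, 30)
PEAK_END = datetime.time(23, 30)

def rate_for(ts):
    # Return the EUR/kWh rate that applies at local time ts.
    return PEAK_RATE if PEAK_START <= ts.time() < PEAK_END else OFFPEAK_RATE

def cost_eur(delta_wh, ts):
    # Convert an energy delta in Wh into a cost in EUR at the rate active at ts.
    return (delta_wh / 1000.0) * rate_for(ts)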

I hope I’ve provided enough details for you to assist me.

Thank you very much for your help.

Grafana version and operating system used:
Grafana v10.1.1 (0cfa76b22d) on Ubuntu 18.04

The database I’m using is InfluxDB:
InfluxDB v1.7.6 (git: 1.7 01c8dd416270f424ab0c40f9291e269ac6921964)

Chronograf Version: 1.7.11

I plan to upgrade InfluxDB to version 2 or 3, so if it’s necessary to do so, please let me know.

My current Python function in the script is:

def get_args():
    # Parse command line arguments.
    parser = argparse.ArgumentParser(description='Read data from PZEM-004t and send it to InfluxDB.')
    parser.add_argument('-db', '--database', type=str, help='Database name', required=True)
    parser.add_argument('-sn', '--session', type=str, help='Session', required=True)
    parser.add_argument('-rn', '--run', type=str, help='Hostname or Number', required=False, default=datetime.datetime.now().strftime("%Y%m%d%H%M"))
    args = parser.parse_args()
    dbname = args.database
    runNo = args.run
    session = args.session
    return dbname, session, runNo

def get_sensor_data():
    # Read data from the PZEM-004t sensor over Modbus RTU
    # (uses pyserial and modbus_tk).
    try:
        # Open the serial port; the device name and settings below are
        # placeholders, adapt them to your setup.
        serial_port = serial.Serial(port='/dev/ttyUSB0', baudrate=9600,
                                    bytesize=8, parity='N', stopbits=1,
                                    xonxoff=0)

        master = modbus_rtu.RtuMaster(serial_port)
        master.set_timeout(2.0)
        master.set_verbose(True)

        # Read 10 input registers starting at address 0 (PZEM-004t register map).
        data = master.execute(1, cst.READ_INPUT_REGISTERS, 0, 10)
        voltage = data[0] / 10.0
        current = (data[1] + (data[2] << 16)) / 1000.0
        power = (data[3] + (data[4] << 16)) / 10.0
        energy = data[5] + (data[6] << 16)
        frequency = data[7] / 10.0
        powerFactor = data[8] / 100.0
        alarm = data[9]

        # Changing the power alarm threshold to 100 W
        # master.execute(1, cst.WRITE_SINGLE_REGISTER, 1, output_value=100)

        master.close()
        serial_port.close()

        return {
            "Voltage [V]": voltage,
            "Current [A]": current,
            "Power [W]": power,
            "Energy [Wh]": energy,
            "Frequency [Hz]": frequency,
            "Power factor []": powerFactor,
            "Alarm": alarm
        }

    except Exception as e:
        print(f"An error occurred: {str(e)}")
        logging.error(f"An error occurred: {str(e)}")
        return None

def send_data_to_influxdb(dbname, runNo, data):
    # Send data to InfluxDB.
    if data is not None:
        now = datetime.datetime.utcnow().isoformat()
        datapoints = [
            {
                "measurement": "watts",
                "tags": {
                    "host": runNo,
                },
                "time": now,
                "fields": data
            }
        ]

        client = InfluxDBClient(HOST, PORT, USER, PASSWORD, dbname)
        try:
            result = client.write_points(datapoints)
            if result:
                logging.info("Data sent to InfluxDB successfully.")
                print("Data sent to InfluxDB successfully.")
            else:
                logging.error("Failed to send data to InfluxDB.")
                print("Failed to send data to InfluxDB.")
        except Exception as e:
            logging.error(f"An error occurred while sending data to InfluxDB: {str(e)}")
            print(f"An error occurred while sending data to InfluxDB: {str(e)}")

if __name__ == "__main__":
    dbname, session, runNo = get_args()
    try:
        while True:
            sensor_data = get_sensor_data()
            send_data_to_influxdb(dbname, runNo, sensor_data)
            time.sleep(SAMPLING_PERIOD)
    except KeyboardInterrupt:
        logging.error("Program stopped by keyboard interrupt [CTRL_C] by user.")
        print("Program stopped by keyboard interrupt [CTRL_C] by user.")
    finally:
        logging.shutdown()
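
One idea I had, but have not tested: tag every point with the tariff that is active at write time, so that Grafana can group by it. Below is a rough sketch reusing the structure of send_data_to_influxdb() above and the PEAK_START / PEAK_END constants from my earlier sketch (the "tariff" tag name and the helper function are my own invention):

def send_data_to_influxdb_with_tariff(dbname, runNo, data):
    # Same as send_data_to_influxdb() above, plus a "tariff" tag
    # ("peak" / "offpeak") derived from the local time of the write,
    # using the PEAK_START / PEAK_END constants from the first sketch.
    if data is None:
        return
    local_now = datetime.datetime.now()
    tariff = "peak" if PEAK_START <= local_now.time() < PEAK_END else "offpeak"
    datapoints = [
        {
            "measurement": "watts",
            "tags": {"host": runNo, "tariff": tariff},
            "time": datetime.datetime.utcnow().isoformat(),
            "fields": data,
        }
    ]
    client = InfluxDBClient(HOST, PORT, USER, PASSWORD, dbname)
    client.write_points(datapoints)

With that tag in place, I imagine I could GROUP BY "tariff" in Grafana and multiply each series by its rate, but I have not verified this.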

Does anyone have any ideas for me, please?

Hello @vincent1890,
Sorry for the delay and thanks for bumping this.
I would use Python, yes.
Is your Python script not working as expected?
What are your concerns with it, specifically?

No, I wouldn’t upgrade to 2.x. InfluxDB v3 has a lot of benefits with regard to performance and interoperability, but the OSS and community editions aren’t available yet. v1 and v3 have more in common, so if you decide to upgrade later, you’re better off staying on v1 for now.


Hello @Anaisdg

Thank you for your response; I thought no one would be able to help me.

Alright, thank you for your reply. I will wait for version 3; that will be better.

My Python script is working well. I included it to explain what data is collected and how it is sent to the InfluxDB server. What I don’t know is how to calculate the Wh consumed only during the specified hours, how to store that correctly, and how to display it in Grafana, which is connected to InfluxDB.
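
To illustrate what I am imagining (just a rough, untested guess on my side, reusing the PEAK_START / PEAK_END constants from my first post), something like this in the polling loop, splitting the cumulative "Energy [Wh]" counter between the two tariffs:

def split_energy(previous_energy, energy, now, peak_wh, offpeak_wh):
    # Attribute the Wh consumed since the previous reading to whichever
    # tariff is active right now, then return the updated totals.
    if previous_energy is not None:
        delta = energy - previous_energy
        if PEAK_START <= now.time() < PEAK_END:
            peak_wh += delta
        else:
            offpeak_wh += delta
    return energy, peak_wh, offpeak_wh

Is that the right direction, or would it be better handled on the database side?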

Thanks for your help.

Not sure how much this will help, but…
You could use this to get an average or a sum in 30-minute buckets over the requested time span and then multiply the results.
Or, at the very least, you could get grouped data for your Python script.
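
Roughly what I have in mind, as an untested Python sketch against the v1 API (measurement and field names taken from your script; the connection details are placeholders, and note that InfluxDB returns UTC timestamps, so you may need to adjust for local time):

from influxdb import InfluxDBClient
import datetime

# Untested sketch: average power per 30-minute bucket over the last day,
# converted to Wh and priced by whichever tariff the bucket falls in.
PEAK_RATE, OFFPEAK_RATE = 0.2285, 0.1672          # EUR per kWh, from the first post
PEAK_START, PEAK_END = datetime.time(7, 30), datetime.time(23, 30)

client = InfluxDBClient('localhost', 8086, database='your_db')   # placeholder connection details
result = client.query(
    'SELECT mean("Power [W]") AS avg_w FROM "watts" '
    'WHERE time > now() - 1d GROUP BY time(30m) fill(none)'
)

total_cost = 0.0
for point in result.get_points():
    ts = datetime.datetime.strptime(point['time'], '%Y-%m-%dT%H:%M:%SZ')   # UTC
    wh = point['avg_w'] * 0.5                     # 30 min at the average power
    rate = PEAK_RATE if PEAK_START <= ts.time() < PEAK_END else OFFPEAK_RATE
    total_cost += (wh / 1000.0) * rate

print(f"Estimated cost over the last day: {total_cost:.2f} EUR")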

If you make it work, it would be awesome if you could share the Flux query 🙂
