Creating a task that transfers data from a Django endpoint to an InfluxDB bucket

Hi,
I am currently using InfluxDB and I have created a task that runs every 20 minutes, takes data from one bucket, and transfers it to another. Here’s the script:

option task = {name: "Test", every: 20m, offset: 0m}

from(bucket: "LowerFrequencyDashboard")
    |> range(start: -task.every)
    |> filter(fn: (r) => r["_measurement"] == "powerTable")
    |> aggregateWindow(every: task.every, fn: mean)
    |> to(bucket: "NewDash", org: "Vadimus")

I am now trying to fetch a value from a Django API endpoint; this would replace the from() call, I guess.
After fetching this data, I would like to write it to my bucket with the to() function. How can I fetch data from a Django API endpoint in the task script?
Thanks a lot

Hello @KMansourr,
Welcome!
That’s so cool to see you using tasks as a new user.
I would use the http.get() function.
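As a rough sketch of what that could look like (the URL, token, and JSON field names here are placeholders, assuming the endpoint returns a flat JSON object):

```flux
import "experimental/http"
import "experimental/json"
import "array"

// Fetch the raw response from the endpoint (placeholder URL and token).
response =
    http.get(
        url: "http://example.com/api/values/",
        headers: {Authorization: "Token <your-token>"},
    )

// Parse the response body as JSON; the "value" field is an assumption.
data = json.parse(data: response.body)

// Wrap the parsed value in a one-row stream so it can be written with to().
array.from(rows: [{_time: now(), _measurement: "example", _field: "value", _value: data.value}])
    |> to(bucket: "NewDash", org: "Vadimus")
```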

QQ: what is the format of the data that you’re fetching from the Django API endpoint?
If it’s JSON you might want to look at the following post as well:

Please let me know if that helps!

Also, what are you using InfluxDB for? I love to hear about what the community is doing with Influx and what your experience has been like so far. Thanks!

This is the script I made for the GET request. It is not working so far, but I think I’m not too far off. Am I using the right library, "experimental/http"?
Let me know if you see anything wrong in the code.

import "experimental/json"
import "experimental/http"
import "array"

option task = {name: "Test", every: 20m, offset: 0m}

response =
    http.get(
        url: "http://127.0.0.1:8000/projects/1/dashboard/cron/",
        headers:
            {
                Authorization:
                    "JWT eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ0b2tlbl90eXBlIjoiYWNjZXNzIiwiZXhwIjoxNjU2NTMyMDY0LCJpYXQiOjE2NTY1MzE3NjQsImp0aSI6IjY4OWUwZGZjYzExNjRlMDE4NGUxNzQyOWE2OTQ3ZTVmIiwidXNlcm5hbWUiOjF9.pU5HQ5lRfJF-F6saKEMofVg238VxaYFspmAcsPlX4_o",
            },
    )
responseBody = response.body
seedRow = [{_time: now()}]

array.from(rows: seedRow)
    |> map(
        fn: (r) => {
            body = json.parse(data: responseBody)

            return {r with _measurement: "powerTable", _field: "value", _value: body.message, name: body.name}
        },
    )
    |> to(org: "Vadimus", bucket: "NewDash")

I will try what you sent about the JSON objects.
To answer your question, we are using InfluxDB to store energy/power consumption data from different buildings and show it in the dashboard of a client app that the client will have access to. We needed a good time-series database, since this is the data we receive from these buildings’ energy inverters. The InfluxDB task fetches the recent data from the inverters and transfers it to the right bucket. The experience has been great so far! The UI is very user-friendly.
I have one more question, about queries. I have been testing to see what exactly counts as a single query, since we might use InfluxDB Cloud, and the query count matters if we do. So far, any query I make from Python code is counted as a single query. Is it like this for any query size? For example, can my query return 10,000 Flux values and still be counted as one query?
Thank you for your help !

@KMansourr,
Thanks for all the info.
Can you please share your errors as well as your responseBody or JSON?
Thank you!

I get this error


I’m not too far off, but at least I confirmed I’m using the right package, experimental/http.
Could you please also tell me if you know the answer to my previous question?

Thank you !

Hello @KMansourr,
I can’t help you parse the JSON without you sharing the JSON with me. Can you please share it?

Yes, you are correct. The size of the response doesn’t influence the query count, only the data out.

I fixed the problem; here’s the fix I had to make in the http.get():


Since InfluxDB was running in a Docker container, I had to use host.docker.internal in the URL instead of localhost, and it worked for me. It would have been easier if I could print/console.log inside the script, but I managed!
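Concretely, the only change was to the url argument of the earlier script (sketch; the token is elided):

```flux
// Before: 127.0.0.1 resolves inside the InfluxDB container itself, not the host machine.
// url: "http://127.0.0.1:8000/projects/1/dashboard/cron/"

// After: host.docker.internal routes to the Docker host, where Django is running.
response =
    http.get(
        url: "http://host.docker.internal:8000/projects/1/dashboard/cron/",
        headers: {Authorization: "JWT <token>"},
    )
```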


@KMansourr thanks for sharing your solution and working through it! I’m happy it’s working for you now!