How to create a query for forecast.solar in Telegraf's inputs.http

I want to pass the data from forecast.solar to InfluxDB via Telegraf and created an inputs.http section for this. Unfortunately, it doesn't work as written below. Has anyone dealt with this and got a working transfer?

[[inputs.http]]
  urls = [
    "https://api.forecast.solar/estimate/watts/51.15/10.45/35/0/1?time=utc"
  ]
  
  headers = {"accept" = "text/csv", "X-Delimiter" = "|", "X-Separator" = ";"}
  
  data_format = "csv"
  name_override = "pvForecastWattHours"
  interval = "3600s"
  csv_header_row_count = 0
  csv_column_names = ["time","value"]
  csv_delimiter = ";"
  csv_timestamp_column = "time"
  csv_timestamp_format = "2006-01-02T15:04:05-07:00"

In debug mode I found the error message "status code 429". What does that mean, and what would be the solution?

[inputs.http] Error in plugin: [url=https://api.forecast.solar/estimate/watts/51.15/10.45/35/0/1?time=utc]: received status code 429 (Too Many Requests), expected any value out of [200]

That response code is from the API itself. It means you have sent too many requests to the API and will not get responses for a little while.

You should check how many requests they allow per hour, then set the interval on that input to something that spreads the requests out accordingly.

From the API docs, the public limit is 12 calls per IP per hour, so setting the interval to once every 5 minutes or longer would be best.
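A minimal sketch with only the interval changed (the URL is the one from the original post; 15m is one choice that stays safely under the limit):

[[inputs.http]]
  urls = [
    "https://api.forecast.solar/estimate/watts/51.15/10.45/35/0/1?time=utc"
  ]
  ## 12 calls per IP per hour allows at most one request every 5 minutes;
  ## a 15-minute interval leaves headroom.
  interval = "15m"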

Hi @jpowers,
yes, you were right. At least that problem is fixed.
However, I don't understand why this simple CSV query is not parsed?
I just don't get any values.
The curl query yields the following:

"2023-01-30 14:00:00";4236
"2023-01-30 15:00:00";4433

And the inputs.http section should be relatively simple:

[[inputs.http]]
  urls = [
    "https://api.forecast.solar/estimate/watts/51.15/10.45/35/0/1"
  ]
  headers = {"accept" = "text/csv"}
  interval = "3600s"
  data_format = "csv"
  csv_header_row_count = 0
  csv_column_names = ["time","value"]
  csv_delimiter = ";"

But something still seems to be missing or wrong?

Without a log, error message, or metric I am only left guessing as to where you are not seeing values and why 🙂

You can always test this using a file plugin:

[[inputs.file]]
  files = ["data.csv"]
  data_format = "csv"

  csv_delimiter = ";"
  csv_header_row_count = 0
  csv_column_names = ["time","value"]
  csv_timestamp_column = "time"
  csv_timestamp_format = "2006-01-02 15:04:05"

[[outputs.file]]

file,host=ryzen value=4236i 1675087200000000000
file,host=ryzen value=4433i 1675090800000000000

The result of my test showed that the timestamp output does not correspond to Unix epoch time, so something in my config is wrong.

[[inputs.file]]
  files = ["/tmp/sample.csv"]
  interval = "60s"
  data_format = "csv"
  csv_header_row_count = 0
  csv_column_names = ["time", "value"]
  csv_delimiter = ";"
  csv_timestamp_format = "2023-01-31 08:03:00"

[[outputs.file]]

This is the source data of the CSV file:

"2023-01-20 08:03:00";0
"2023-01-20 09:00:00";37
"2023-01-20 10:00:00";71

and this is the influx output:

file,host=telegraf_dev time="2023-01-20 08:03:00",value=0i 1675160340000000000
file,host=telegraf_dev time="2023-01-20 09:00:00",value=37i 1675160340000000000
file,host=telegraf_dev time="2023-01-20 10:00:00",value=71i 1675160340000000000

It looks like the time column hasn’t been converted.
What mistake am I making here?

That field takes a layout in Go's reference-time format, not a literal date. Please use the example found in my previous post.
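For reference, a sketch of just the two timestamp settings, matching the sample rows above; the layout string spells out Go's fixed reference time rather than any actual date:

  csv_timestamp_column = "time"
  ## Go's reference time is "Mon Jan 2 15:04:05 MST 2006"; this layout
  ## matches values such as "2023-01-30 14:00:00".
  csv_timestamp_format = "2006-01-02 15:04:05"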

Exactly, I updated the timestamp format accordingly, and the error message is:

[inputs.file] Error in plugin: could not parse "/tmp/sample.csv": parsing time "2023-01-31" as "2006-01-02 15:04:05": cannot parse "" as "15"

This means the timestamp that was received is actually "2023-01-31" and does not contain any hours/minutes/seconds.

You would need to update your timestamp format to drop the time-of-day components (e.g. 2006-01-02).
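A sketch of that single setting, assuming the file really contained date-only values:

  ## Date-only layout for values such as "2023-01-31":
  csv_timestamp_format = "2006-01-02"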

For inputs.file I use a CSV file with the following content:

"2023-01-31 07:51:00";0
"2023-01-31 08:00:00";32
"2023-01-31 09:00:00";171

There is a space between the day and the hour. Maybe that's the problem?
In any case, the data from forecast.solar would also come in in the same format.
I only use the CSV file for debugging.

That field is a quoted string, so it is read in as a whole, space included. My post with an example should be copy/paste-able.

Can you share your full config again?

Of course, here is the config:

[[inputs.file]]
  files = ["/tmp/sample.csv"]
  data_format = "csv"
  csv_header_row_count = 0
  csv_column_names = ["time", "value"]
  csv_delimiter = ";"
  csv_timestamp_column = "time"
  csv_timestamp_format = "2006-01-02 15:04:05"

[[outputs.file]]
  files = ["/tmp/metrics.out"]
  data_format = "influx"

And the CSV file is as above.

What version of Telegraf are you running?
Can you provide the full logs, please?

config.toml:

[[inputs.file]]
  files = ["/tmp/sample.csv"]
  data_format = "csv"
  csv_header_row_count = 0
  csv_column_names = ["time", "value"]
  csv_delimiter = ";"
  csv_timestamp_column = "time"
  csv_timestamp_format = "2006-01-02 15:04:05"

[[outputs.file]]
  files = ["/tmp/metrics.out"]
  data_format = "influx"

/tmp/sample.csv:

"2023-01-30 14:00:00";4236
"2023-01-30 15:00:00";4433

/tmp/metrics.out:

file,host=ryzen value=4236i 1675087200000000000
file,host=ryzen value=4433i 1675090800000000000
file,host=ryzen value=4236i 1675087200000000000
file,host=ryzen value=4433i 1675090800000000000
file,host=ryzen value=4236i 1675087200000000000
file,host=ryzen value=4433i 1675090800000000000
file,host=ryzen value=4236i 1675087200000000000
file,host=ryzen value=4433i 1675090800000000000
file,host=ryzen value=4236i 1675087200000000000
file,host=ryzen value=4433i 1675090800000000000
file,host=ryzen value=4236i 1675087200000000000
file,host=ryzen value=4433i 1675090800000000000
❯ ../telegraf-builds/telegraf-v1.25.1 --config config.toml --debug
2023-01-31T16:26:32Z I! Starting Telegraf 1.25.1
2023-01-31T16:26:32Z I! Available plugins: 228 inputs, 9 aggregators, 26 processors, 21 parsers, 57 outputs, 2 secret-stores
2023-01-31T16:26:32Z I! Loaded inputs: file
2023-01-31T16:26:32Z I! Loaded aggregators: 
2023-01-31T16:26:32Z I! Loaded processors: 
2023-01-31T16:26:32Z I! Loaded secretstores: 
2023-01-31T16:26:32Z I! Loaded outputs: file
2023-01-31T16:26:32Z I! Tags enabled: host=ryzen
2023-01-31T16:26:32Z I! [agent] Config: Interval:10s, Quiet:false, Hostname:"ryzen", Flush Interval:10s
2023-01-31T16:26:32Z D! [agent] Initializing plugins
2023-01-31T16:26:32Z D! [agent] Connecting outputs
2023-01-31T16:26:32Z D! [agent] Attempting connection to [outputs.file]
2023-01-31T16:26:32Z D! [agent] Successfully connected to outputs.file
2023-01-31T16:26:32Z D! [agent] Starting service inputs
2023-01-31T16:26:42Z D! [outputs.file] Wrote batch of 2 metrics in 48.69µs
2023-01-31T16:26:42Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2023-01-31T16:26:52Z D! [outputs.file] Wrote batch of 2 metrics in 28.3µs
2023-01-31T16:26:52Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2023-01-31T16:27:02Z D! [outputs.file] Wrote batch of 2 metrics in 37.52µs
2023-01-31T16:27:02Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2023-01-31T16:27:12Z D! [outputs.file] Wrote batch of 2 metrics in 30.281µs
2023-01-31T16:27:12Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2023-01-31T16:27:22Z D! [outputs.file] Wrote batch of 2 metrics in 28.93µs
2023-01-31T16:27:22Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2023-01-31T16:27:32Z D! [outputs.file] Wrote batch of 2 metrics in 28.91µs
2023-01-31T16:27:32Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
^C2023-01-31T16:27:34Z D! [agent] Stopping service inputs
2023-01-31T16:27:34Z D! [agent] Input channel closed
2023-01-31T16:27:34Z I! [agent] Hang on, flushing any cached metrics before shutdown
2023-01-31T16:27:34Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics

Now it works as expected. I reset everything again, rewrote the config file, and filled sample.csv again. With the above setup, the Influx data arrives as intended. I don't have an explanation for why it didn't work before. The log file looks like this:

2023-01-31T17:14:16Z I! Starting Telegraf 1.23.2
2023-01-31T17:14:16Z I! Loaded inputs: file
2023-01-31T17:14:16Z I! Loaded aggregators: 
2023-01-31T17:14:16Z I! Loaded processors: 
2023-01-31T17:14:16Z I! Loaded outputs: file
2023-01-31T17:14:16Z I! Tags enabled: host=telegraf_dev
2023-01-31T17:14:16Z I! [agent] Config: Interval:30s, Quiet:false, Hostname:"telegraf_dev", Flush Interval:10s
2023-01-31T17:14:16Z D! [agent] Initializing plugins
2023-01-31T17:14:16Z D! [agent] Connecting outputs
2023-01-31T17:14:16Z D! [agent] Attempting connection to [outputs.file]
2023-01-31T17:14:16Z D! [agent] Successfully connected to outputs.file
2023-01-31T17:14:16Z D! [agent] Starting service inputs
2023-01-31T17:14:26Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2023-01-31T17:14:36Z D! [outputs.file] Wrote batch of 22 metrics in 401.12µs
2023-01-31T17:14:36Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics

In the meantime I was able to reproduce the problem with a different setup. The forecast.solar API can also deliver the data with associated tags. If I fill sample.csv with these values, the inputs.file configuration has to be adjusted. I did that, and the error occurs again.

config.toml:

[[inputs.file]]
  files = ["/tmp/sample.csv"]
  data_format = "csv"
  csv_header_row_count = 0
  csv_column_names = ["type", "time", "value"]
  csv_delimiter = ";"
  csv_tag_columns = ["type"]
  csv_timestamp_column = "time"
  csv_timestamp_format = "2006-01-02 15:04:05"

[[outputs.file]]
  ## Files to write to, "stdout" is a specially handled file.
  files = ["/tmp/metrics.log"]
  data_format = "influx"

/tmp/sample.csv:

watts;"2023-01-31 07:51:00";0
watts;"2023-01-31 08:00:00";32
watts;"2023-01-31 09:00:00";171

Each row carries the tag 'watts'.

and here is the associated log with the problem “Error in plugin”:

2023-01-31T17:33:02Z I! Starting Telegraf 1.23.2
2023-01-31T17:33:02Z I! Loaded inputs: file
2023-01-31T17:33:02Z I! Loaded aggregators: 
2023-01-31T17:33:02Z I! Loaded processors: 
2023-01-31T17:33:02Z I! Loaded outputs: file
2023-01-31T17:33:02Z I! Tags enabled: host=telegraf_dev
2023-01-31T17:33:02Z I! [agent] Config: Interval:30s, Quiet:false, Hostname:"telegraf_dev", Flush Interval:10s
2023-01-31T17:33:02Z D! [agent] Initializing plugins
2023-01-31T17:33:02Z D! [agent] Connecting outputs
2023-01-31T17:33:02Z D! [agent] Attempting connection to [outputs.file]
2023-01-31T17:33:02Z D! [agent] Successfully connected to outputs.file
2023-01-31T17:33:02Z D! [agent] Starting service inputs
2023-01-31T17:33:12Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2023-01-31T17:33:22Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics
2023-01-31T17:33:30Z E! [inputs.file] Error in plugin: could not parse "/tmp/sample.csv": parsing time "2023-01-31" as "2006-01-02 15:04:05": cannot parse "" as "15"
2023-01-31T17:33:32Z D! [outputs.file] Buffer fullness: 0 / 10000 metrics

Isn’t that interesting?

I found the time parsing problem!
I received the data via Postman and copied it into the CSV file. In this list were two data rows whose timestamps did not match the Go layout; that's why this error message came up.
Now I let the API send me the data with time=nanoseconds. The timestamps then arrive preformatted as Unix nanoseconds and everything works as desired.
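For illustration, the tagged rows then look roughly like this (a hypothetical sample: the timestamps are the Unix-nanosecond values from the test output earlier in the thread, paired with the 'watts' tag):

watts;1675087200000000000;4236
watts;1675090800000000000;4433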
I will post the complete code for the forecast.solar API here soon.

Great to hear, thanks for following up. Would be happy to see that example for future uses.

The goal was to visualize the data from forecast.solar via Telegraf in InfluxDB and Grafana.
The forecast.solar API exposes four tags. If you want to load all four tags at once, it is important to use the correct syntax.
Whether the API stream works and the home location is correct can be checked with, e.g.:

https://api.forecast.solar/check/51.15/10.45

The content of the API stream is of course only an example and must be adjusted according to the forecast.solar docs.
The inputs.http section in Telegraf then looks like this:

[[inputs.http]]
  urls = [
    "https://api.forecast.solar/estimate/51.15/10.45/30/0/5?time=nanoseconds"
  ]
  timeout = "5s"
  headers = {"accept" = "text/csv"}
  name_override = "forecast_solar"
  interval = "3600s"
  data_format = "csv"
  tagexclude = ["host"]
  csv_header_row_count = 0
  csv_column_names = ["type", "time", "value"]
  csv_delimiter = ";"
  csv_tag_columns = ["type"]
  csv_timestamp_column = "time"
  csv_timestamp_format = "unix_ns"
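For reference, the resulting line protocol should then look roughly like this (a hypothetical sample with values borrowed from the earlier file test; the host tag is dropped by tagexclude):

forecast_solar,type=watts value=4236i 1675087200000000000
forecast_solar,type=watts value=4433i 1675090800000000000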

Hello,

I came across this discussion while trying to import Forecast.solar data into Prometheus using Telegraf. I managed to get the data into Telegraf and can access it in Prometheus as well, but it is not clear to me how to use the data for visualisation in Grafana. Do you have a dashboard based on a Prometheus data source that you or someone else can share?

Thanks in advance and kind regards

Jens