How does mean(value) in continuous queries calculate the mean? Is it an average of the points (sum of points / number of points)?

I have power (watt) data coming into InfluxDB. I chose to only send data when there is a minimum change, so the incoming data is irregularly spaced in time. It doesn’t arrive at a regular periodic interval, like every 2 seconds; instead, I might get lots of data points when there are lots of changes, and few data points when there are few changes.

I then wrote a continuous query to average the values every minute, like this:

```
CREATE CONTINUOUS QUERY "cq_watts_minute" ON "telegraf"
BEGIN
SELECT mean("value") AS value
INTO "infinite"."watts_minute"
FROM "autogen"."emon_watts"
GROUP BY time(1m), topic
FILL(previous)
END
```

Does taking the mean of the watt values make sense if the values aren’t regular periodic data? The outcome I want is the average watts used for that minute. But if the data is irregularly spaced, does mean() average over time, or over the number of points? The two can give quite different results when the data density changes.

For example, say the first 30 seconds of data fluctuates between 45 watts and 55 watts with lots of data points, while the next 30 seconds are 0 all the way through, so I only get a single 0 data point. The time average should be about 25W for that minute (half the minute at ~50W, half at 0W). But if you average by number of data points, the n-average would be closer to 50W for that same minute. I want the time average.
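To make the difference concrete, here is a small Python sketch of that one-minute window (this is outside InfluxDB, just illustrating the arithmetic). It assumes the sparse 0W reading is carried forward until the end of the minute (last-value-carried-forward), which matches the behavior I want:

```python
# Toy illustration: compare a plain point mean with a time-weighted mean
# for one minute of irregularly spaced samples.
# First 30 s: a sample every second, alternating 45 W and 55 W (dense data).
# At t = 30 s: a single 0 W sample (sparse data).

samples = [(t, 45 if t % 2 == 0 else 55) for t in range(30)]  # dense half
samples.append((30, 0))                                       # one sparse point
window_end = 60  # seconds in the window

# Point mean: every sample counts equally, regardless of spacing.
point_mean = sum(v for _, v in samples) / len(samples)

# Time-weighted mean: each value is held until the next sample arrives,
# then weighted by how long it was held.
total = 0.0
for (t0, v), (t1, _) in zip(samples, samples[1:]):
    total += v * (t1 - t0)
t_last, v_last = samples[-1]
total += v_last * (window_end - t_last)  # hold the last value to 60 s
time_mean = total / window_end

print(round(point_mean, 1))  # ~48.4 W: dominated by the dense 45-55 W points
print(time_mean)             # 25.0 W: half the minute at ~50 W, half at 0 W
```

The dense half contributes 31 of the 31 points' weight to the point mean but only half the window's weight to the time-weighted mean, which is why the two answers diverge so much.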

If mean() is the wrong function to use in this situation, is there another function like mean() that takes time into account?