Limitation of fields within a measurement

In a measurement I typically have between 10 and 50 fields.
(A device sending telemetry data like temperature, speed, downtime,…)
After 2 weeks I want to downsample the data to a larger aggregate, e.g. 12h. In order not to lose too much information, I write the mean, median, min, max, sum, and count into the target measurement, so that I am still able to calculate e.g. the sum over a larger time span…

CREATE CONTINUOUS QUERY …
RESAMPLE EVERY 30m
BEGIN
  SELECT mean(*), sum(*), median(*), … INTO … FROM … GROUP BY time(12h), *
END
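
For example, once the CQ has written the 12h aggregates, a total over a longer window can be recovered by summing them again. A minimal sketch, assuming a source field named downtime and a target measurement telemetry_12h (both placeholder names); with sum(*), the CQ writes each source field as sum_<field>:

-- "telemetry_12h" and "downtime" are hypothetical names; sum(*) in the CQ
-- stores each source field as "sum_<field>" in the target measurement.
SELECT sum("sum_downtime") FROM "telemetry_12h" WHERE time > now() - 90d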

Is there any limitation on the number of fields? With the approach above I will have many more fields in the second measurement than in the first one.

Hello @thais,
That shouldn’t be a problem, especially if you haven’t changed the max-series-per-database setting.
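
To see how close a database is to that limit, you can check its series cardinality. A quick sketch; "mydb" is a placeholder database name:

-- Number of distinct series in the database (InfluxQL, requires 1.4+).
-- Compare against max-series-per-database (default 1,000,000).
SHOW SERIES CARDINALITY ON "mydb"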
I recommend creating a CQ for each aggregation and automatically downsampling the database with backreferencing. Use a function with a wildcard (*) and the INTO query’s backreferencing syntax to automatically downsample data from all measurements and numerical fields in a database, for example:

CREATE CONTINUOUS QUERY "cq_basic_br" ON "transportation"
BEGIN
  SELECT mean(*) INTO "downsampled_transportation"."autogen".:MEASUREMENT FROM /.*/ GROUP BY time(30m),*
END

as described in the InfluxDB continuous queries documentation.
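
Following the one-CQ-per-aggregation recommendation above, a companion CQ for the sum would follow the same pattern. The name "cq_basic_br_sum" is hypothetical; since sum(*) writes fields as sum_<field>, its output won't collide with the mean CQ's mean_<field> fields in the same target measurement:

CREATE CONTINUOUS QUERY "cq_basic_br_sum" ON "transportation"
BEGIN
  SELECT sum(*) INTO "downsampled_transportation"."autogen".:MEASUREMENT FROM /.*/ GROUP BY time(30m),*
END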

Or is there some reason why you need each downsampled series to exist in the same measurement?

Thanks for your answer @Anaisdg !

Why is that better?

No, the only reason is that I have one measurement per device, which is easier for me to handle.

Are there actually any performance penalties when a measurement has many fields?