I am new to InfluxDB. Several months ago I created a database A in InfluxDB with the fields (time, value) and the tags (id, name); here is the measurement:
time                     id   name   value
2021-10-01 10:20:30.333  101  test   2.2
2021-10-01 10:20:30.433  101  test1  3.2
2021-10-01 10:20:30.533  101  test2  4.2
Now I don’t need the tag id, so I planned to drop id and copy all the data to a new measurement. I tried to export to CSV, but that failed, maybe because the count is more than 5000K. Then I happened to find the SELECT INTO clause:
select time value into measurementB from measurementA
but only 200 rows were inserted into the new measurement. Then I tried
select time, value into measurementB from measurementA
but it was still not successful.
Could anyone tell me how to export all the data successfully, keeping only the tag name and the fields time and value?
A plain SELECT statement as the source of a SELECT … INTO will treat everything as fields, therefore losing all tags… without tags you will get way less data, as only the last point will be kept for each timestamp.
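A minimal sketch of the two behaviors, using the measurement names from the question (the exact query text here is an assumption, not the poster’s commands):

```sql
-- Plain SELECT INTO: "name" is written as a field, and points that share
-- a timestamp overwrite each other, so most rows are lost.
SELECT "name", "value" INTO measurementB FROM measurementA

-- GROUP BY * copies the raw points and preserves all tags as tags
-- (including "id" here, so it doesn't drop the unwanted tag by itself).
SELECT "value" INTO measurementB FROM measurementA GROUP BY *
```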
The export/import via CSV is a good idea; why it failed I can’t tell. If you want to move a portion of your data using SELECT INTO, you also need to use GROUP BY in order to keep the tags… something like:
SELECT mean(value) as value INTO _db_._rp_._NewMeasurement_ FROM _db_._rp_._OldMeasurement_ GROUP BY time(1s),name fill(none)
Some notes to make it work:
- identify the proper time interval for the grouping so that you don’t actually aggregate data (ideally you don’t want to aggregate, so use an interval that contains just one point)
- identify the proper aggregation function (mean/last/sum) so that, in case of aggregation, the value is still meaningful
- use aliases for the values; otherwise a name will be assigned automatically based on the aggregation function used (and it won’t be the original name but something like “mean” or “value.mean”)
- in the GROUP BY, list all the tags you want to keep
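Putting the notes together for the schema in this thread — a sketch only: the timestamps look millisecond-spaced, so time(1ms) buckets should hold at most one point each, and the database/retention-policy names are placeholders:

```sql
-- last() plus the alias keeps the original field name "value";
-- fill(none) skips the many empty 1 ms buckets.
SELECT last("value") AS "value"
INTO "mydb"."autogen"."measurementB"
FROM "mydb"."autogen"."measurementA"
GROUP BY time(1ms), "name" fill(none)
```

Grouping by "name" but not "id" is what drops the unwanted tag during the copy.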
Thanks for your feedback.
I tried your suggestion, and it did successfully insert the records into the new measurement and keep the tag; it helped me a lot.
But why does this query insert only 2,200 records? There are actually more than 500,000 records in the old measurement. Is there any configuration I can change to move all the records from the old measurement to the new one?
There should be no limit on the rows… my guess is that it has something to do with the grouping. If you run the SELECT query (without the INTO) on the source measurement, you should get the number of rows that will be sent to the destination.
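One way to sanity-check this, sketched with the measurement names from the thread (the 1s interval is an assumption — use whatever interval you picked for the copy):

```sql
-- Preview how many points the grouped SELECT would produce, per series:
SELECT count("value") FROM measurementA GROUP BY time(1s), "name" fill(none)

-- After the copy, compare the totals in source and destination:
SELECT count("value") FROM measurementA
SELECT count("value") FROM measurementB
```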
If the process takes forever because you are moving lots of rows, you can filter your queries by time and move maybe one day at a time.
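A sketch of moving one day at a time (the dates, interval, and aggregation are placeholders to adapt):

```sql
SELECT last("value") AS "value"
INTO measurementB
FROM measurementA
WHERE time >= '2021-10-01T00:00:00Z' AND time < '2021-10-02T00:00:00Z'
GROUP BY time(1ms), "name" fill(none)
```

Repeat with the next day’s time window until the whole range is copied.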
Thanks for your patience.
Actually, I ran the command in a piece of software named ‘DBeaver’, which is a database management tool, and maybe due to a restriction in that software it could only write part of the data.
Then I tried the same command you suggested directly in the InfluxDB command line, and it worked successfully!
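For reference, a sketch of running it non-interactively with the 1.x influx CLI (the database name and query are placeholders, and this obviously needs a running server):

```shell
influx -database 'mydb' \
  -execute 'SELECT last("value") AS "value" INTO measurementB FROM measurementA GROUP BY time(1ms), "name" fill(none)'
```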