Hello @Anaisdg,
thanks for the hint!
I ended up doing the following:
Assume there is a database `test` and a measurement `mymeasurement` with the following definition:
# DDL
CREATE DATABASE test
# DML
# CONTEXT-DATABASE: test
mymeasurement,host=myhost value=1.0 1689230446
mymeasurement,host=myhost value=2.0 1689230447
mymeasurement,host=myhost value=2.0 1689230448
mymeasurement,host=myhost value=2.0 1689230449
mymeasurement,host=myhost value=4.0 1689230450
mymeasurement,host=myhost value=5.0 1689230451
mymeasurement,host=myhost,timestamp=1689230452 value=1.0 1689230452
mymeasurement,host=myhost,timestamp=1689230453 value=2.0 1689230453
mymeasurement,host=myhost,timestamp=1689230454 value=3.0 1689230454
mymeasurement,host=myhost,timestamp=1689230455 value=4.0 1689230455
mymeasurement,host=myhost,timestamp=1689230456 value=5.0 1689230456
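For reference, a dump in this format can be loaded with the 1.x CLI importer. A minimal sketch, assuming the definition above is saved in a file named mymeasurement.txt (hypothetical name) and the timestamps are in seconds:

influx -import -path=mymeasurement.txt -precision=s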
- Query the data with the `timestamp` tag set and dump it to a text file, something like:
influx -database test -execute "select * from mymeasurement where timestamp != '' " -format 'csv' > data_with_timestamp.csv
This resulted in:
name,time,host,timestamp,value
mymeasurement,1689230452000000000,myhost,1689230452,1
mymeasurement,1689230453000000000,myhost,1689230453,2
mymeasurement,1689230454000000000,myhost,1689230454,3
mymeasurement,1689230455000000000,myhost,1689230455,4
- Drop the series that contain the `timestamp` tag:
influx -database test -execute "drop series from mymeasurement where timestamp != ''"
- Fix the data in the `data_with_timestamp.csv` file by removing the `timestamp` tag/values and then reimport it with whatever tool you prefer (a minimal shell sketch is below).
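For completeness, one way to do the fix-and-reimport step with standard shell tools. This is only a sketch, assuming the CSV has exactly the column order shown above (name,time,host,timestamp,value), an InfluxDB 1.x instance on localhost:8086, and a scratch file fixed_points.txt (hypothetical name):

# Skip the CSV header, drop the timestamp column (field 4) and rebuild line protocol:
# <measurement>,host=<host> value=<value> <time in ns>
tail -n +2 data_with_timestamp.csv | awk -F, '{printf "%s,host=%s value=%s %s\n", $1, $3, $5, $2}' > fixed_points.txt
# Write the cleaned points back; the time column is already in nanoseconds, the default precision of /write:
curl -i -XPOST 'http://localhost:8086/write?db=test' --data-binary @fixed_points.txt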