[SOLVED] Kapacitor task is not writing all data back to InfluxDB

Hello,
I’ve created a Kapacitor task that queries data from InfluxDB, shifts its timestamps, and writes the data back to InfluxDB.
The task seems to work fine, with no errors, saying that X points were written to InfluxDB, but when I query InfluxDB I can see that most of the points were not actually written. How can I debug this? No error message is showing up…

this is my task ‘show’:

[ec2-user@ip-10-108-19-159 ~]$ sudo kapacitor show db_generator2
ID: db_generator2
Error:
Template:
Type: batch
Status: enabled
Executing: true
Created: 17 May 18 08:34 UTC
Modified: 22 May 18 11:39 UTC
LastEnabled: 22 May 18 11:39 UTC
Databases Retention Policies: ["Air_Data"."autogen"]
TICKscript:
var data = batch
|query(''' SELECT * FROM "Air_Data"."autogen"."volumetric_V1" ''')
.period(1m)
.every(10s)
.offset(155d)

data
|shift(155d)
|influxDBOut()
.database('Airtel_Data')
.retentionPolicy('autogen')
.measurement('volumetric_V1')
.precision('s')

DOT:
digraph db_generator2 {
graph [throughput="0.00 batches/s"];

query1 [avg_exec_time_ns="104.034097ms" batches_queried="12" errors="0" points_queried="19176" working_cardinality="0" ];
query1 -> shift2 [processed="12"];

shift2 [avg_exec_time_ns="1.114µs" errors="0" working_cardinality="0" ];
shift2 -> influxdb_out3 [processed="12"];

influxdb_out3 [avg_exec_time_ns="958ns" errors="0" points_written="19176" working_cardinality="0" write_errors="0" ];
}

And this is the InfluxDB query; instead of showing thousands of results, I only get a few:

> select count(*) from volumetric_V1 where time > now() -10m
name: volumetric_V1
time                count_group count_historical_mean count_historical_stddev count_host count_level1 count_level2 count_level3 count_level4 count_level5 count_level6 count_metric count_monitor count_status count_version
----                ----------- --------------------- ----------------------- ---------- ------------ ------------ ------------ ------------ ------------ ------------ ------------ ------------- ------------ -------------
1526989113481080948 3           3                     3                       3          3            3            3            3            3            3            3            3             3            3
>

What am I doing wrong?
This is driving me crazy :slight_smile:

Solved: I had .groupBy(*) missing :slight_smile:
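For anyone hitting the same thing, this is roughly what my fixed script looks like (a sketch with my own database/measurement names). I believe the reason the counts were so low is that without .groupBy(*) Kapacitor merges all series returned by the batch query into one, so points from different tags that share a timestamp overwrite each other on write. Adding .groupBy(*) on the query node keeps the series separate:

```tickscript
var data = batch
    |query(''' SELECT * FROM "Air_Data"."autogen"."volumetric_V1" ''')
        .period(1m)
        .every(10s)
        .offset(155d)
        // Keep each tag set as its own series so points with the
        // same timestamp but different tags don't overwrite each other.
        .groupBy(*)

data
    |shift(155d)
    |influxDBOut()
        .database('Airtel_Data')
        .retentionPolicy('autogen')
        .measurement('volumetric_V1')
        .precision('s')
```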


Hey @NinjaDude. I am facing a similar issue where I am not able to write a batch of records obtained from InfluxDB back to it.

dbrp "dummydb"."autogen"

var chunk = batch
                |query('SELECT * FROM "dummydb"."autogen".new_data')
                .period(1d)
                .every(5m)
                |influxDBOut()
                .database('dummydb')
                .retentionPolicy('daily')
                .measurement('out_data')

DOT:
digraph task1 {
graph [throughput="0.00 batches/s"];

query1 [avg_exec_time_ns="0s" batches_queried="0" errors="0" points_queried="0" working_cardinality="0" ];
}

When I enable the batch task, I do not see an 'out_data' measurement created, and batches_queried is displayed as 0. I want to ingest this batch of records into a Python script but I am unable to do so. It would be awesome if you could point out the mistake in my script.
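Since batches_queried is 0, the problem looks like the query returning no data rather than the write failing. One way to check that in isolation is Kapacitor's record/replay workflow (commands from the kapacitor CLI; the task name here is mine):

```shell
# Record what the task's batch query actually returns for the past day.
# If the resulting recording contains zero points, the query or the
# dbrp line is wrong, not the influxDBOut() side.
kapacitor record batch -task task1 -past 1d

# Replay the recording against the task, using the recording ID
# printed by the command above (placeholder, substitute your own).
kapacitor replay -task task1 -recording <recording-id>
```

If the recording is empty, it's worth double-checking that data actually exists in "dummydb"."autogen".new_data within the query period, and that the dbrp in the script matches where the data lives.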