Why are shards not removed from the filesystem, and why are they still being read into memory on start-up?

I modified the default “autogen” policy to change its duration from INFINITY to a finite value (2160h, i.e. 90 days, as the listing shows; a sketch of the ALTER statement follows it). These are my retention policies:

> show retention policies
name        duration  shardGroupDuration replicaN default
----        --------  ------------------ -------- -------
autogen     2160h0m0s 168h0m0s           1        true
rp_20_weeks 3360h0m0s 24h0m0s            1        false
rp_1_week   168h0m0s  24h0m0s            1        false
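
For reference, the duration change was made with a statement along these lines (a sketch; I didn't keep the exact command, and 90d is just 2160h written as days):

> ALTER RETENTION POLICY "autogen" ON "tasks" DURATION 90d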

I could see in the logs that the autogen retention policy kicked in and reported the shard groups as deleted, but in reality the shard data is still on disk and is being read on start-up.

2019-12-13T08:26:27.029371Z     info    Retention policy deletion check (start) {"log_id": "0JbnFR60000", "service": "retention", "trace_id": "0JgahZLG000", "op_name": "retention_delete_check", "op_event": "start"}
2019-12-13T08:26:27.039845Z     info    Deleted shard group     {"log_id": "0JbnFR60000", "service": "retention", "trace_id": "0JgahZLG000", "op_name": "retention_delete_check", "db_instance": "tasks", "db_shard_group": 32, "db_rp": "autogen"}
2019-12-13T08:26:27.043632Z     info    Deleted shard group     {"log_id": "0JbnFR60000", "service": "retention", "trace_id": "0JgahZLG000", "op_name": "retention_delete_check", "db_instance": "tasks", "db_shard_group": 8, "db_rp": "autogen"}
2019-12-13T08:26:27.047899Z     info    Deleted shard group     {"log_id": "0JbnFR60000", "service": "retention", "trace_id": "0JgahZLG000", "op_name": "retention_delete_check", "db_instance": "tasks", "db_shard_group": 16, "db_rp": "autogen"}
<<SNIP>>
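
Grepping the log for that op_name shows when each retention pass starts and finishes; the log path here is just an example, adjust it to wherever your instance writes:

$ grep 'retention_delete_check' /var/log/influxdb/influxd.log | grep 'op_event'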

And those shards are still lying around:

$ ls -lh /proj/metrics-influx/influxdb/data/tasks/autogen/ | egrep ' (32|8|16)$'
drwxr-sr-x 2 usera group1 4.0K Dec 13 04:21 16
drwxr-sr-x 2 usera group1 4.0K Dec 13 04:22 32
drwxr-sr-x 2 usera group1 4.0K Dec 13 04:21 8
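
Since SHOW SHARDS lists every shard id the meta store still tracks, a rough way to spot leftovers is to compare its output with the directory names on disk. A sketch, assuming the 1.x column layout (id, database, retention_policy, ...) and arbitrary /tmp paths:

$ influx -execute 'SHOW SHARDS' | awk '$2 == "tasks" && $3 == "autogen" {print $1}' | sort > /tmp/meta_shards
$ ls /proj/metrics-influx/influxdb/data/tasks/autogen/ | sort > /tmp/disk_shards
$ comm -13 /tmp/meta_shards /tmp/disk_shards    # on disk but unknown to the meta store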

And the deleted ones are no longer listed in the shard groups either:

> show shard groups
name: shard groups
id  database      retention_policy start_time           end_time             expiry_time
--  --------      ---------------- ----------           --------             -----------
188 _internal     monitor          2019-12-06T00:00:00Z 2019-12-07T00:00:00Z 2019-12-14T00:00:00Z
189 _internal     monitor          2019-12-07T00:00:00Z 2019-12-08T00:00:00Z 2019-12-15T00:00:00Z
190 _internal     monitor          2019-12-08T00:00:00Z 2019-12-09T00:00:00Z 2019-12-16T00:00:00Z
191 _internal     monitor          2019-12-09T00:00:00Z 2019-12-10T00:00:00Z 2019-12-17T00:00:00Z
193 _internal     monitor          2019-12-10T00:00:00Z 2019-12-11T00:00:00Z 2019-12-18T00:00:00Z
196 _internal     monitor          2019-12-11T00:00:00Z 2019-12-12T00:00:00Z 2019-12-19T00:00:00Z
199 _internal     monitor          2019-12-12T00:00:00Z 2019-12-13T00:00:00Z 2019-12-20T00:00:00Z
202 _internal     monitor          2019-12-13T00:00:00Z 2019-12-14T00:00:00Z 2019-12-21T00:00:00Z
88  tasks         autogen          2019-09-09T00:00:00Z 2019-09-16T00:00:00Z 2019-12-15T00:00:00Z
96  tasks         autogen          2019-09-16T00:00:00Z 2019-09-23T00:00:00Z 2019-12-22T00:00:00Z
104 tasks         autogen          2019-09-23T00:00:00Z 2019-09-30T00:00:00Z 2019-12-29T00:00:00Z
112 tasks         autogen          2019-09-30T00:00:00Z 2019-10-07T00:00:00Z 2020-01-05T00:00:00Z
120 tasks         autogen          2019-10-07T00:00:00Z 2019-10-14T00:00:00Z 2020-01-12T00:00:00Z
128 tasks         autogen          2019-10-14T00:00:00Z 2019-10-21T00:00:00Z 2020-01-19T00:00:00Z
136 tasks         autogen          2019-10-21T00:00:00Z 2019-10-28T00:00:00Z 2020-01-26T00:00:00Z
144 tasks         autogen          2019-10-28T00:00:00Z 2019-11-04T00:00:00Z 2020-02-02T00:00:00Z
152 tasks         autogen          2019-11-04T00:00:00Z 2019-11-11T00:00:00Z 2020-02-09T00:00:00Z
160 tasks         autogen          2019-11-11T00:00:00Z 2019-11-18T00:00:00Z 2020-02-16T00:00:00Z
168 tasks         autogen          2019-11-18T00:00:00Z 2019-11-25T00:00:00Z 2020-02-23T00:00:00Z
176 tasks         autogen          2019-11-25T00:00:00Z 2019-12-02T00:00:00Z 2020-03-01T00:00:00Z
184 tasks         autogen          2019-12-02T00:00:00Z 2019-12-09T00:00:00Z 2020-03-08T00:00:00Z
192 tasks         autogen          2019-12-09T00:00:00Z 2019-12-16T00:00:00Z 2020-03-15T00:00:00Z

But the supposedly deleted ones are still being read on start-up:

2019-12-13T09:21:29.528115Z     info    Opened file     {"log_id": "0Jgdr2e0000", "engine": "tsm1", "service": "filestore", "path": "/proj/metrics-influx/influxdb/data/tasks/autogen/8/000000015-000000002.tsm", "id": 0, "duration": "3.368ms"}
2019-12-13T09:21:29.528936Z     info    Opened file     {"log_id": "0Jgdr2e0000", "engine": "tsm1", "service": "filestore", "path": "/proj/metrics-influx/influxdb/data/tasks/autogen/64/000000001-000000001.tsm", "id": 0, "duration": "4.153ms"}
2019-12-13T09:21:29.574114Z     info    Opened file     {"log_id": "0Jgdr2e0000", "engine": "tsm1", "service": "filestore", "path": "/proj/metrics-influx/influxdb/data/tasks/autogen/32/000000207-000000003.tsm", "id": 1, "duration": "49.166ms"}
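
A quick way to confirm which of the supposedly deleted shards are opened at start-up is to grep for their paths in the same log (again, the log path is just an example):

$ grep 'Opened file' /var/log/influxdb/influxd.log | egrep 'tasks/autogen/(8|16|32)/'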

Am I missing anything here?

Thanks in advance,
Rajkiran

Okay, I see that the shard files have now been removed from the filesystem too; it took roughly 6 hours. Is there some config that needs to be tweaked, or am I missing something?
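
For what it's worth, the only related knob I have found is check-interval in the [retention] section of influxdb.conf. A minimal sketch with what I believe are the defaults:

[retention]
  # Enables the retention policy enforcement service.
  enabled = true
  # How often the service checks for and removes expired shard data.
  # The default is 30m as far as I know, so this alone would not explain
  # a ~6 hour gap between "Deleted shard group" and the files going away.
  check-interval = "30m"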

Thanks,
Rajkiran