Downsampling hasn't reduced storage size

At our company we use InfluxDB to store an accessibility (reachability) percentage for devices, at a resolution of one data point every 10 minutes. We have been collecting this data for over a year now, and with more than 100k devices (roughly 14 million points per day) storage has naturally become a major concern. That's when I came across downsampling and tried it out on one of our test servers.

The attachment shows the structure of the data stored in our DB bucket (ppc-data).
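For anyone who can't open the attachment, each point looks roughly like this in line protocol (the value and timestamp here are made up; the measurement, tag, and field names are the real ones used in the query below):

traps,device_id=XX:XX:XX:XX:XX:XX reachability_percentage=99.4 1672531200000000000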

The query below is what I ran to populate the data-downsample bucket:

from(bucket: "ppc-data")
  |> range(start: -90d)
  |> filter(fn: (r) => r._measurement == "traps")
  |> filter(fn: (r) => r._field == "reachability_percentage")
  |> filter(fn: (r) => r["device_id"] == "XX:XX:XX:XX:XX:XX")
  |> aggregateWindow(every: 1h, fn: mean, createEmpty: false)  // roll the 10-minute points up into hourly means
  |> to(bucket: "data-downsample")  // write the result into the downsampled bucket
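For reference, the point counts mentioned below can be checked with a simple count query along these lines (just a sketch; adjust the bucket name and time range as needed):

from(bucket: "data-downsample")
  |> range(start: -90d)
  |> filter(fn: (r) => r._measurement == "traps")
  |> filter(fn: (r) => r._field == "reachability_percentage")
  |> group()   // merge all series into a single table
  |> count()   // total number of points in the window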

The issue I see is this: the source data in the ppc-data bucket, 643,064 data points spanning about 10 months, comes to roughly 9 MB.

The downsampled data, 2,141 data points, comes to roughly 7.2 MB. The downsampled data is the hourly mean of the reachability percentage over 90 days for a single device. That works out to roughly 15 bytes per point in the source bucket versus more than 3 KB per point in the downsampled bucket.
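In case it's relevant to the storage question, the series cardinality of the two buckets can be compared with something like the following (a sketch, assuming a Flux version that includes influxdb.cardinality()):

import "influxdata/influxdb"

// unique series stored in the source bucket
influxdb.cardinality(bucket: "ppc-data", start: -1y)
  |> yield(name: "ppc-data")

// unique series stored in the downsampled bucket
influxdb.cardinality(bucket: "data-downsample", start: -1y)
  |> yield(name: "data-downsample")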

I can't quite work out what I'm doing wrong here. The downsampled data has far fewer records at a lower resolution, so I was expecting it to occupy far less storage.

Can somebody please help me understand the issue?