How to reset a cumulative sum based on a condition?

Hi,

I am trying to calculate my daily stock position and average price per ticker. I have a measurement with my transactions. Example:

#constant measurement,stocks_transactions
#datatype dateTime:2006-01-02,tag,tag,tag,tag,tag,long,double,double
date,Exchange,Broker,Ticker,Stock_Sector,Operation_Type,Number_Units,Unit_Price,Operation_Value
2022-05-17,B3,AA,TICKER1,Sector_1,Buy,100,18.84,1884
2022-05-11,B3,AA,TICKER1,Sector_1,Sell,-200,16.00,-3200
2022-05-10,B3,AA,TICKER1,Sector_1,Buy,100,13.79,1379
2022-05-09,B3,AA,TICKER1,Sector_1,Buy,100,15.70,1570

I am using the following query:

from(bucket: "stocks")
  |> range(start: -1y)
  |> filter(fn: (r) => r._measurement == "stocks_transactions")
  |> filter(fn: (r) => r.Ticker == "TICKER1")
  |> filter(fn: (r) => r._field == "Number_Units" or r._field == "Operation_Value")
  |> keep(columns: ["_field", "_value", "_start", "_stop", "_time", "Ticker"])
  |> sort(columns: ["_time"])
  |> cumulativeSum()
  |> pivot(rowKey: ["_time"], columnKey: ["_field"], valueColumn: "_value")
  |> map(fn: (r) => ({r with _value: float(v: r.Operation_Value) / float(v: r.Number_Units)}))

But since I sold my entire TICKER1 position on 2022-05-11, the average price computed with cumulativeSum() is "wrong" from that point on.

table  _value  _start                    _stop                     _time                     Number_Units  Operation_Value  Ticker
0      15.7    2021-06-06T10:05:57.701Z  2022-06-06T16:05:57.701Z  2022-05-09T00:00:00.000Z  100           1570             TICKER1
0      14.745  2021-06-06T10:05:57.701Z  2022-06-06T16:05:57.701Z  2022-05-10T00:00:00.000Z  200           2949             TICKER1
0      NaN     2021-06-06T10:05:57.701Z  2022-06-06T16:05:57.701Z  2022-05-11T00:00:00.000Z  0             -251             TICKER1
0      16.33   2021-06-06T10:05:57.701Z  2022-06-06T16:05:57.701Z  2022-05-17T00:00:00.000Z  100           1633             TICKER1

If I have sold my entire TICKER1 position, I would like the cumulativeSum to start again from zero on the next Buy. Then my average price would be 18.84 and the Operation_Value would be 1884.

I also tried the following approach:

from(bucket: "stocks")
  |> range(start: -1y)
  |> filter(fn: (r) => r._measurement == "stocks_transactions")
  |> filter(fn: (r) => r.Ticker == "TICKER1")
  |> filter(fn: (r) => r._field == "Number_Units" or r._field == "Operation_Value")
  |> keep(columns: ["_field", "_value", "_start", "_stop", "_time", "Ticker"])
  |> sort(columns: ["_time"])
  |> cumulativeSum()
  |> aggregateWindow(every: 1d, fn: mean, timeSrc: "_start")
  |> fill(usePrevious: true)
  |> filter(fn: (r) => exists r._value)
  |> pivot(rowKey: ["_time"], columnKey: ["_field"], valueColumn: "_value")
  |> map(fn: (r) => ({r with _value: float(v: r.Operation_Value) / float(v: r.Number_Units)}))

The output:

table  _value  _start                    _stop                     _time                     Number_Units  Operation_Value  Ticker
0      15.7    2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-09T00:00:00.000Z  100           1570             TICKER1
0      14.745  2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-10T00:00:00.000Z  200           2949             TICKER1
0      NaN     2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-11T00:00:00.000Z  0             -251             TICKER1
0      NaN     2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-12T00:00:00.000Z  0             -251             TICKER1
0      NaN     2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-13T00:00:00.000Z  0             -251             TICKER1
0      NaN     2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-14T00:00:00.000Z  0             -251             TICKER1
0      NaN     2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-15T00:00:00.000Z  0             -251             TICKER1
0      NaN     2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-16T00:00:00.000Z  0             -251             TICKER1
0      27.33   2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-17T00:00:00.000Z  100           2733             TICKER1

But then I would need to keep the carried-over value on only one row while I have 0 holdings. The output I would like:

table  _value  _start                    _stop                     _time                     Number_Units  Operation_Value  Ticker
0      15.7    2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-09T00:00:00.000Z  100           1570             TICKER1
0      14.745  2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-10T00:00:00.000Z  200           2949             TICKER1
0      NaN     2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-11T00:00:00.000Z  0             -251             TICKER1
0      NaN     2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-12T00:00:00.000Z  0             0                TICKER1
0      NaN     2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-13T00:00:00.000Z  0             0                TICKER1
0      NaN     2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-14T00:00:00.000Z  0             0                TICKER1
0      NaN     2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-15T00:00:00.000Z  0             0                TICKER1
0      NaN     2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-16T00:00:00.000Z  0             0                TICKER1
0      18.84   2021-06-06T08:32:06.770Z  2022-06-06T14:32:06.770Z  2022-05-17T00:00:00.000Z  100           1884             TICKER1

Any ideas?

Thank you

Hello @fluxuser123,
Unfortunately, I don’t think there’s a way to do this with Flux at the moment, but the Flux team is working on adding functionality to support this.

Are you trying to do this for each individual position you take (by lot or by ticker), or across the whole portfolio?

If you are trying to do it by lot, you need to create a separate value for a lot number/key and do your cumulativeSum() based on the lot key. It gets messy, though, if you sell part of your lot in one transaction; you then need to re-key the lot IDs somehow.
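The lot-keying idea could be sketched roughly like this, in Python rather than Flux, purely for illustration (the `assign_lot_keys` helper and the sample transactions are hypothetical, mirroring the Number_Units and Operation_Value fields from the thread):

```python
def assign_lot_keys(transactions):
    """Tag each (units, value) transaction with a lot number.

    A new lot starts on the first transaction after the running
    position (cumulative units) has dropped back to zero.
    """
    tagged = []
    lot = 0
    position = 0
    for units, value in transactions:
        if position == 0 and tagged:  # position was flat: start a new lot
            lot += 1
        position += units
        tagged.append((lot, units, value))
    return tagged

txns = [
    (100, 1570.0),    # 2022-05-09 Buy
    (100, 1379.0),    # 2022-05-10 Buy
    (-200, -3200.0),  # 2022-05-11 Sell, position goes to zero
    (100, 1884.0),    # 2022-05-17 Buy, starts lot 1
]
print(assign_lot_keys(txns))
# The 2022-05-17 Buy ends up keyed as lot 1; the first three rows are lot 0.
```

With a lot key in place, a per-lot cumulative sum is just an ordinary cumulativeSum() grouped by that key.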

Hello Anaisdg,

Thank you for the information. That will be a good feature; there are many use cases.

Thank you

Hi Patrick,

I would like to have cumulativeSum() by ticker and reset the cumulative sum each time the ticker's cumulative Number_Units reaches zero.

Thanks

Sounds like you need to map the sum to a new field with a reset rather than using the built-in cumulativeSum() function.
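As a rough illustration of that "map with a reset" logic (sketched in Python rather than Flux, since the thread notes Flux cannot express the reset directly today; `cumsum_with_reset` and the sample rows are hypothetical):

```python
def cumsum_with_reset(rows):
    """rows: list of (units, value) per transaction.

    Returns (cum_units, cum_value, avg_price) per row, restarting
    both running totals whenever the unit total has returned to zero.
    """
    out = []
    cum_units = 0
    cum_value = 0.0
    for units, value in rows:
        if cum_units == 0:  # position is flat: reset the running sums
            cum_units, cum_value = 0, 0.0
        cum_units += units
        cum_value += value
        avg = cum_value / cum_units if cum_units != 0 else float("nan")
        out.append((cum_units, cum_value, avg))
    return out

rows = [(100, 1570.0), (100, 1379.0), (-200, -3200.0), (100, 1884.0)]
for r in cumsum_with_reset(rows):
    print(r)
# After the sell zeroes the position, the 2022-05-17 Buy yields
# (100, 1884.0, 18.84) instead of carrying the -251 forward.
```

This is the behavior the desired output above describes: the -251 left over from the sell is dropped once holdings hit zero, so the next Buy starts a fresh cumulative sum and the average price comes out as 18.84.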