How to set _time when writing task query results into a bucket

Hello,
I’m trying to create a task that executes a more or less complex query which calculates the duration between two events in a measurement.
The task should run every minute and write the duration from the query below as _value. That part I seem to have covered, but I’m struggling to get a valid _time written to the target bucket. Essentially, the _time column should be the execution timestamp of the task.
I’ve tried using set() to explicitly set the _time column, but that is not allowed (set() only accepts string values).
Any hints on how one can explicitly set the _time stamp when writing into another bucket?
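For context, the task is scheduled roughly like this (the task name here is just a placeholder):

option task = {name: "cronjob-duration", every: 1m}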

from(bucket: "dtexport")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "cronjob_status_by_multi" and r["site"] == "test")
  |> keep(columns: ["_time","cronjob","status"])
  |> group(columns: ["cronjob"])
  |> sort(columns: ["_time"], desc: false)
  |> top(n:2, columns: ["_time"])
  |> sort(columns: ["_time"], desc: false)
  |> elapsed(unit: 1m) // using elapsed() to determine whether a job ran quickly (started and terminated in the same minute)
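  // status() and exectime() are custom functions I define earlier in the task (omitted here for brevity)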
  |> map(fn: (r) => ({r with _truestatus: status(elapsed: r.elapsed, st: r.status)}))
  |> map(fn: (r) => ({r with executiontime: exectime(elapsed: r.elapsed, st: r._truestatus, time: r._time)}))
  |> filter(fn: (r) => r["_truestatus"] == "RUNNING")
  |> drop(columns: ["elapsed","_truestatus","status"])
  |> set(key: "_measurement", value: "duration")
  |> set(key: "_field", value: "cronjob")
  |> set(key: "_value", value: "executiontime")
  |> to(bucket: "jobexecution")
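
What I was hoping for is something along these lines right before the write, i.e. overwriting _time via map() with now() (which, in a task, should be the scheduled execution time as far as I understand), but I’m not sure this is the intended way:

  |> map(fn: (r) => ({ r with _time: now() })) // stamp each row with the task's execution time
  |> to(bucket: "jobexecution")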

Thanks!