turbaszek commented on issue #8903:
URL: https://github.com/apache/airflow/issues/8903#issuecomment-648784763
> @turbaszek In case of a downscale or a pod dying, you'd want to check if the
job is still running, hence the need for a job id derived from the task
name and execution date.
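A deterministic id of that kind could be sketched like this. This is a hedged illustration only: the helper name, hash choice, and id format are assumptions, not Airflow's actual implementation.

```python
import hashlib

def deterministic_job_id(dag_id: str, task_id: str, execution_date: str) -> str:
    """Hypothetical helper: build a reproducible job id from task coordinates.

    The same (dag_id, task_id, execution_date) always yields the same id, so a
    restarted worker can look up the job it already submitted. BigQuery job ids
    allow letters, digits, underscores and dashes, which the format respects.
    """
    key = f"{dag_id}:{task_id}:{execution_date}".encode()
    digest = hashlib.md5(key).hexdigest()
    return f"airflow_{dag_id}_{task_id}_{digest}"
```

Because the id is a pure function of the task's identity, a retry after a pod dies regenerates the identical id and can check the job's status instead of submitting a duplicate.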
turbaszek commented on issue #8903:
URL: https://github.com/apache/airflow/issues/8903#issuecomment-642921263
@albertocalderari I think I may be missing something. What do you understand
by a deterministic job_id?
turbaszek commented on issue #8903:
URL: https://github.com/apache/airflow/issues/8903#issuecomment-642029283
@albertocalderari should we consider this issue as resolved?
turbaszek commented on issue #8903:
URL: https://github.com/apache/airflow/issues/8903#issuecomment-642029064
> So my personal opinion is stick with the dict :)
Mine is the same. And what's more, a dict is JSON serializable, so it can be
used as a template field!
turbaszek commented on issue #8903:
URL: https://github.com/apache/airflow/issues/8903#issuecomment-634816078
Summoning @edejong to hear his opinion :)
turbaszek commented on issue #8903:
URL: https://github.com/apache/airflow/issues/8903#issuecomment-630783205
> My idea was to use the methods exposed from google.cloud.bigquery.Client
along with google.cloud.bigquery.{JobType}JobConfig rather than using
dictionaries.
I see your
turbaszek commented on issue #8903:
URL: https://github.com/apache/airflow/issues/8903#issuecomment-630443552
Hi @albertocalderari, I'm currently working on a refactor of the BQ integration.
I decided to abandon the custom "run" and use the `insert_job` method, which will
accept a `job_id`: