boraberke commented on PR #38868: URL: https://github.com/apache/airflow/pull/38868#issuecomment-2114446925
Hi @josh-fell, thank you for your feedback!

> I'm trying to understand what happens if a user sets `retry_from_failure=True` on the operator and provides either `steps_override`, `schema_override`, or `additional_run_config` initially and the task is naturally retried in Airflow. It seems like with the most recent changes, the task would fail because those args were supplied originally once `retry_from_failure()` is called in the DbtCloudHook. Can you clarify that for me?

Yes, that is correct.

> Maybe to alleviate both scenarios, when `retry_from_failure=True`, the `trigger_job_run()` method actually retrieves the job's status from dbt Cloud, assesses whether or not to call the retry endpoint based on success/failure? This would completely remove using Airflow internals to control how the job triggering behaves.

I agree that an additional check of the previously run jobs, deciding based on the state of the latest run, would make this better. In that scenario, there are three cases:

1. There are no previous dbt runs: use `run`
2. The previous dbt run failed: use `rerun`
3. The previous dbt run succeeded: use `run`

I will also change the error to a warning: whenever `rerun` is used, the overrides will not be taken into account; in all other cases, overrides work as expected. I will make the necessary changes and let you know!
