boraberke commented on PR #38868: URL: https://github.com/apache/airflow/pull/38868#issuecomment-2081995764
> I feel like an error might make more sense 🤔 I don't personally use dbt that much, but I guess `steps_override, schema_override, threads_override` could significantly change the behavior somehow. If that's the case, it might be better if we raise an error. But please correct me if I'm wrong 🙂 Thanks!

Yes, those parameters change the behavior significantly. My only concern is with the `try_number > 1` check: these parameters can actually work in the first run, i.e. `try_number = 1`. We can either:

1. Not allow users to use `steps_override`, `schema_override`, `additional_run_config` when `rerun_from_failure` is set to `True` (raise an error).
2. Keep it as it is and only show a warning when `try_number > 1`. In this case, users can use those overrides in the first run, and the rerun would then do the same on the dbt Cloud side by simply rerunning the previous run, as explained in the [docs](https://docs.getdbt.com/dbt-cloud/api-v2#/operations/Retry%20Failed%20Job).

For me, the second approach feels more suitable because it does not limit the users, but it all depends on `try_number` and can make the behavior more complicated to understand. Let me know what you think :)
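To make the trade-off concrete, here is a minimal sketch of how the two options could look as a validation helper. The function name `validate_rerun_overrides`, the `raise_error` switch, and the standalone-function shape are hypothetical and not taken from this PR; in the actual operator the retry count would come from Airflow's `context["ti"].try_number`.

```python
import warnings


def validate_rerun_overrides(
    rerun_from_failure: bool,
    try_number: int,  # in Airflow this would be context["ti"].try_number
    steps_override=None,
    schema_override=None,
    additional_run_config=None,
    raise_error: bool = False,  # True models option 1, False models option 2
) -> None:
    """Check whether override parameters conflict with rerun_from_failure."""
    overrides_used = any(
        v is not None for v in (steps_override, schema_override, additional_run_config)
    )
    if not (rerun_from_failure and overrides_used):
        return

    msg = (
        "steps_override, schema_override and additional_run_config are ignored when "
        "rerun_from_failure retries a previous run; dbt Cloud reruns it as-is."
    )
    if raise_error:
        # Option 1: reject the combination outright, regardless of try_number.
        raise ValueError(msg)
    if try_number > 1:
        # Option 2: overrides are honored on the first attempt; warn on retries,
        # where dbt Cloud simply reruns the previous run.
        warnings.warn(msg, UserWarning)
```

With option 2 the first attempt behaves exactly as a normal triggered run, and only retries fall back to dbt Cloud's rerun-from-failure semantics, which is the behavior described above.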
