nathadfield opened a new issue, #50389:
URL: https://github.com/apache/airflow/issues/50389

   ### Apache Airflow version
   
   3.0.0
   
   ### If "Other Airflow 2 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   In Airflow 3, using `default_args` that include keys not accepted by a particular operator now causes `.partial()` to raise a `TypeError` during dynamic task mapping (`expand()` or `expand_kwargs()`).
   
   This strict argument validation did not exist in Airflow 2.x, where it was common and valid to share a single `default_args` dict (e.g. with `poke_interval`, `email`, etc.) across multiple operators; operators simply ignored keys they didn't use. The change in Airflow 3 breaks this pattern and results in DAGs failing to parse.
   
   For example, a DAG that shares the same `default_args` between a sensor (which accepts `poke_interval`) and a `BashOperator` (which does not) now fails when `.partial()` is used with the `BashOperator`.
   
   ### What you think should happen instead?
   
   Airflow should ignore irrelevant `default_args` keys during `.partial()` resolution, just as it does during regular operator instantiation, or provide a clear mechanism for filtering them automatically.
   
   At a minimum, if strict validation is the intended design going forward, it should be:
   
   - documented as a breaking change in Airflow 3, and
   - accompanied by a recommended pattern that lets DAG authors safely reuse `default_args` across mixed operator types with dynamic task mapping.
   
   ### How to reproduce
   
   Here’s a minimal DAG that fails in Airflow 3:
   
   ```python
   from airflow.sdk import DAG, task
   from airflow.providers.standard.operators.bash import BashOperator
   
   default_args = {
       'start_date': None,
       'poke_interval': 300,  # Valid for sensors, invalid for BashOperator
   }
   
   with DAG(
       dag_id='dynamic_task_mapping_default_args_bug',
       default_args=default_args,
       schedule=None,
       catchup=False,
   ) as dag:
   
       @task
       def get_commands():
           return [{"bash_command": "echo hello"}]
   
       BashOperator.partial(
           task_id="run_bash"
       ).expand_kwargs(get_commands())
   ```
   
   The DAG fails to parse with the following error:
   ```
   Traceback (most recent call last):
     File "/opt/airflow/task-sdk/src/airflow/sdk/definitions/mappedoperator.py", line 171, in __attrs_post_init__
       validate_mapping_kwargs(self.operator_class, "partial", self.kwargs)
     File "/opt/airflow/task-sdk/src/airflow/sdk/definitions/mappedoperator.py", line 118, in validate_mapping_kwargs
       raise TypeError(f"{op.__name__}.{func}() got {error}")
   TypeError: BashOperator.partial() got an unexpected keyword argument 'poke_interval'
   ```
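   For context, the failure mode can be reproduced without Airflow at all. The sketch below (a hypothetical `check_partial_kwargs`, not Airflow's actual `validate_mapping_kwargs`) shows the essence of the strict check: any key missing from the operator's accepted parameter names raises a `TypeError` immediately at parse time:

```python
def check_partial_kwargs(op_name, accepted, kwargs):
    # Hypothetical sketch of the strict check: any keyword not in the
    # operator's accepted parameter names is rejected outright.
    unknown = sorted(set(kwargs) - set(accepted))
    if unknown:
        raise TypeError(
            f"{op_name}.partial() got an unexpected keyword argument {unknown[0]!r}"
        )


# 'poke_interval' is a sensor argument, not a BashOperator one:
try:
    check_partial_kwargs("BashOperator", {"bash_command", "env"}, {"poke_interval": 300})
except TypeError as exc:
    print(exc)
# BashOperator.partial() got an unexpected keyword argument 'poke_interval'
```

   Because this runs during DAG parsing, a single stray key in shared `default_args` takes the whole DAG file down rather than affecting just one task.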
   
   ### Operating System
   
   PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
   NAME="Debian GNU/Linux"
   VERSION_ID="12"
   VERSION="12 (bookworm)"
   VERSION_CODENAME=bookworm
   ID=debian
   HOME_URL="https://www.debian.org/"
   SUPPORT_URL="https://www.debian.org/support"
   BUG_REPORT_URL="https://bugs.debian.org/"
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Other Docker-based deployment
   
   ### Deployment details
   
   N/A
   
   ### Anything else?
   
   - This issue affects any operator used with `.partial()` and dynamic task mapping when `default_args` contains keys that the operator does not accept.
   
   - It breaks backward compatibility with many Airflow 2.x DAGs that relied on shared `default_args`.
   
   ### Are you willing to submit PR?
   
   - [x] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
