perry2of5 commented on issue #43361:
URL: https://github.com/apache/airflow/issues/43361#issuecomment-2436246932

   Looks like the AWS S3 sensor solves this without breaking backwards compatibility by inspecting the function's signature to determine whether a context can be passed in:
   
https://github.com/apache/airflow/blob/main/providers/src/airflow/providers/amazon/aws/sensors/s3.py#L169
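   A minimal sketch of that kind of check (not the sensor's exact code; `accepts_context` and the `context` parameter name are just placeholders here):

```python
import inspect
from typing import Any, Callable


def accepts_context(fn: Callable[..., Any]) -> bool:
    """Return True if ``fn`` can receive the context, e.g. via a ``context`` param or **kwargs."""
    params = inspect.signature(fn).parameters.values()
    return any(
        p.kind == inspect.Parameter.VAR_KEYWORD or p.name == "context"
        for p in params
    )
```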
   
   The Google Cloud DataprocCreateClusterOperator does the opposite, removing an argument while preserving backwards compatibility:
   
https://github.com/apache/airflow/blob/main/providers/src/airflow/providers/google/cloud/operators/dataproc.py#L680
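   A rough sketch of that pattern (a hypothetical operator, not the Dataproc code itself): keep accepting the removed argument, warn, and discard it so existing DAGs keep working:

```python
import warnings


class ExampleOperator:
    """Hypothetical operator, not DataprocCreateClusterOperator itself."""

    def __init__(self, cluster_name: str, region: str, **kwargs):
        # Accept the removed argument so existing DAGs keep working,
        # warn about it, and then simply drop it.
        if "deprecated_arg" in kwargs:
            kwargs.pop("deprecated_arg")
            warnings.warn(
                "deprecated_arg is no longer used and will be ignored",
                DeprecationWarning,
                stacklevel=2,
            )
        self.cluster_name = cluster_name
        self.region = region
```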
   
   The Snowflake provider injects the session into the operator kwargs if and 
only if it was passed into the function:
   
https://github.com/apache/airflow/blob/main/providers/src/airflow/providers/snowflake/utils/snowpark.py#L40
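   Roughly, that injection pattern looks like this (`build_operator_kwargs` is a hypothetical helper, not the Snowflake provider's actual function):

```python
from typing import Any


def build_operator_kwargs(base_kwargs: dict[str, Any], session: Any = None) -> dict[str, Any]:
    """Hypothetical helper: add ``session`` to the kwargs only if one was given."""
    kwargs = dict(base_kwargs)
    if session is not None:
        # Operators that don't know about a session never see the extra key.
        kwargs["session"] = session
    return kwargs
```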
   
   I think I'll fix this by allowing the callback to accept the context if it wants to; the operator can then inspect the callback's arguments and pass the context only if the callback can handle it. Both callback shapes are illustrated below.
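   Assuming that approach, callbacks could look like either of the following (names and the `value` argument are illustrative, not the final interface):

```python
def legacy_check(value):
    # Existing callbacks that ignore the context keep working unchanged;
    # the operator would detect this signature and call legacy_check(value).
    return value > 0


def context_aware_check(value, context):
    # A callback can opt in by declaring a ``context`` parameter; the operator
    # would then call context_aware_check(value, context=context).
    return value > 0 and context is not None
```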
   

