GitHub user Rishabh1627rawat added a comment to the discussion: Using 
DatabricksSubmitRunOperator inside @task — is pool applied correctly

Hi,

I noticed that in this implementation, the DatabricksSubmitRunOperator is 
created and its execute() method is called inside a @task-decorated function.

My senior implemented it this way and mentioned that the pool configuration 
works correctly. However, I'd like to understand how this actually works 
internally.

Since the pool is defined on the @task, does that mean the pool slot applies 
only to the outer TaskFlow task, while the Databricks operator is simply 
executed as regular Python code inside it, rather than as a separately 
scheduled Airflow task?
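To make my question concrete: my understanding is that calling execute() inside a function body is just an ordinary method call, with no scheduler involvement. Here is a minimal pure-Python sketch of that idea. Note that FakeSubmitRunOperator is a hypothetical stand-in, not the real DatabricksSubmitRunOperator, and no Airflow APIs are used; it only illustrates why the inner operator would not be a separately scheduled task.

```python
class FakeSubmitRunOperator:
    """Hypothetical stand-in for an Airflow operator: execute() is a method."""

    def __init__(self, run_name):
        self.run_name = run_name

    def execute(self, context=None):
        # In the real operator this would submit a Databricks run;
        # here it just returns a marker value.
        return f"submitted:{self.run_name}"


def outer_task():
    # Imagine this function is decorated with @task(pool="databricks_pool").
    # Instantiating an operator and calling execute() here is ordinary
    # Python: the scheduler never sees the inner operator, so (as far as
    # I understand) only the outer task would consume a pool slot.
    op = FakeSubmitRunOperator(run_name="demo")
    return op.execute(context={})


result = outer_task()
print(result)  # submitted:demo
```

If this reading is right, the pool would still limit concurrency, but only at the granularity of the outer TaskFlow task.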

I just want to clarify whether this is the intended and recommended pattern, 
or whether defining the operator directly as a DAG task would be more 
appropriate.

Thanks in advance for the clarification!

GitHub link: 
https://github.com/apache/airflow/discussions/62403#discussioncomment-15919798

----
This is an automatically sent email for [email protected].
To unsubscribe, please send an email to: [email protected]
