GitHub user pykenny edited a comment on the discussion: Using nested 
(decorated) function in decorated tasks (plus integration with TaskFlow)

Still working through the Breeze installation/configuration, but I think being 
able to combine other decorators/wrappers with the existing Airflow task 
decorators would be very useful when a task needs to access multiple external 
services. It avoids having to explicitly instantiate hooks inside the function, 
write a customized task decorator that bundles everything together (through a 
provider plugin), or add extra workarounds that break the task into several 
sequential steps (with corresponding operators).

```python
@task
@hook.mysql(conn_id="my_connection_sql", arg_name="mysql_hook")
@hook.postgres(conn_id="my_connection_pg", arg_name="pg_hook")
@hook.amazon.s3(conn_id="my_connection_s3", arg_name="s3_hook")
def my_task(
    upstream_data_01: dict,
    upstream_data_02: str,
    mysql_hook=None,
    pg_hook=None,
    s3_hook=None,
):
    ...
```

Equivalent to:

```python
@task
def my_task(
    upstream_data_01: dict,
    upstream_data_02: str,
):
    from airflow.providers.mysql.hooks.mysql import MySqlHook
    from airflow.providers.postgres.hooks.postgres import PostgresHook
    from airflow.providers.amazon.aws.hooks.s3 import S3Hook

    mysql_hook = MySqlHook("my_connection_sql")
    pg_hook = PostgresHook("my_connection_pg")
    s3_hook = S3Hook("my_connection_s3")
    ...
```
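
For illustration, such an argument-injecting decorator factory could be sketched roughly as below. This is purely hypothetical: `@hook.mysql` and friends are a proposed API, and `inject_hook`/`fake_mysql_hook` here are made-up names using a stand-in factory instead of a real Airflow hook.

```python
import functools


def inject_hook(hook_factory, conn_id, arg_name):
    """Hypothetical decorator factory: build the hook lazily at call time
    and pass it to the wrapped callable under ``arg_name``."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Only inject when the caller has not supplied the argument.
            kwargs.setdefault(arg_name, hook_factory(conn_id))
            return func(*args, **kwargs)
        return wrapper
    return decorator


# Stand-in for a real hook constructor (e.g. MySqlHook):
def fake_mysql_hook(conn_id):
    return f"MySqlHook({conn_id})"


@inject_hook(fake_mysql_hook, "my_connection_sql", "mysql_hook")
def my_task(upstream_data, mysql_hook=None):
    return upstream_data, mysql_hook


print(my_task("data"))  # ('data', 'MySqlHook(my_connection_sql)')
```

Building the hook inside the wrapper (rather than at decoration time) matters for Airflow, since the connection should be resolved at task execution, not at DAG parse time.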

If this were possible, official decorators, third-party provider decorators, and 
user-defined decorators could all be used at the same time:

```python
# Say `@task.customized` is registered by some third-party provider
@task.customized(task_id="my_task")
@hook.mysql(conn_id="my_connection_sql", arg_name="mysql_hook")
@hook.postgres(conn_id="my_connection_pg", arg_name="pg_hook")
@hook.amazon.s3(conn_id="my_connection_s3", arg_name="s3_hook")
@default_logger
def my_task(
    upstream_data_01: dict,
    upstream_data_02: str,
    mysql_hook=None,
    pg_hook=None,
    s_hook=None,
    logger=None,
):
    ...
```

GitHub link: 
https://github.com/apache/airflow/discussions/45963#discussioncomment-11938206
