GitHub user pykenny edited a comment on the discussion: Using nested
(decorated) function in decorated tasks (plus integration with TaskFlow)
Still working through the Breeze installation/configuration, but I think being
able to combine other decorators/wrappers with the existing Airflow task
decorators could be very useful when a single task needs to access multiple
external services, instead of explicitly instantiating hooks inside the
function or adding extra workarounds to split the task into several sequential
steps (with corresponding operators).
```python
@task
@hook.mysql(conn_id="my_connection_sql", arg_name="mysql_hook")
@hook.postgres(conn_id="my_connection_pg", arg_name="pg_hook")
@hook.amazon.s3(conn_id="my_connection_s3", arg_name="s3_hook")
def my_task(
    upstream_data_01: dict,
    upstream_data_02: str,
    mysql_hook=None,
    pg_hook=None,
    s3_hook=None,
):
    ...
```
Equivalent to:
```python
@task
def my_task(
    upstream_data_01: dict,
    upstream_data_02: str,
):
    from airflow.providers.mysql.hooks.mysql import MySqlHook
    from airflow.providers.postgres.hooks.postgres import PostgresHook
    from airflow.providers.amazon.aws.hooks.s3 import S3Hook

    mysql_hook = MySqlHook("my_connection_sql")
    pg_hook = PostgresHook("my_connection_pg")
    s3_hook = S3Hook("my_connection_s3")
    ...
```
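For what it's worth, a decorator like the proposed `@hook.*` family could be built with plain `functools.wraps`, constructing the hook lazily at call time and injecting it under the requested keyword argument. Below is a minimal sketch of that idea; `inject_hook` and `FakeHook` are hypothetical names I made up for illustration, not part of any Airflow API, and a real implementation would pass a provider hook class plus `conn_id` instead of the stand-in factory:

```python
import functools


def inject_hook(hook_factory, arg_name):
    """Decorator factory: build the hook when the task runs and pass it
    to the wrapped function as the keyword argument `arg_name`."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Only construct the hook if the caller did not supply one,
            # so tests can still inject a mock explicitly.
            kwargs.setdefault(arg_name, hook_factory())
            return func(*args, **kwargs)
        return wrapper
    return decorator


class FakeHook:
    """Stand-in for a real Airflow hook; only stores the conn_id."""
    def __init__(self, conn_id):
        self.conn_id = conn_id


@inject_hook(lambda: FakeHook("my_connection_sql"), "mysql_hook")
def my_task(upstream_data: str, mysql_hook=None):
    return (upstream_data, mysql_hook.conn_id)


print(my_task("payload"))  # ('payload', 'my_connection_sql')
```

Building the hook inside the wrapper (rather than at decoration time) matters for Airflow: the connection lookup should happen on the worker when the task executes, not at DAG-parse time.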
GitHub link:
https://github.com/apache/airflow/discussions/45963#discussioncomment-11938206