Lee-W commented on issue #34206:
URL: https://github.com/apache/airflow/issues/34206#issuecomment-2342600710

   How about using it with dynamic task mapping?
   
   ```python
   from __future__ import annotations
   
   from pendulum import datetime
   
   from airflow.datasets import Dataset, DatasetAlias
   from airflow.datasets.metadata import Metadata
   from airflow.decorators import dag, task
   
   my_alias_name = "alias-dataset"
   
   
   @dag(
       dag_display_name="TEST Alias upstream",
       start_date=datetime(2024, 8, 1),
       schedule=None,
       catchup=False,
       tags=["Test"],
   )
   def dataset_alias_dynamic_test():
       @task
       def upstream_task():
           return ["a", "b"]
   
       @task(outlets=[DatasetAlias(my_alias_name)])
       def use_metadata(name):
           yield Metadata(
               Dataset(name),
               alias=my_alias_name,
               extra={},  # extra is NOT optional
           )
   
       use_metadata.expand(name=upstream_task())
   
   
   dataset_alias_dynamic_test()
   
   
   @dag(
       start_date=datetime(2024, 8, 1),
       schedule=[DatasetAlias(my_alias_name)],
       catchup=False,
       tags=["Test"],
   )
   def downstream_alias():
       @task
       def t1():
           return 0
   
       t1()
   
   
   downstream_alias()
   ```
   
   Change `upstream_task` to the task that returns the successfully processed tables.
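   
   For illustration, here is a minimal sketch of what such an upstream task could look like, assuming a hypothetical `load_table` helper and hard-coded table names (neither is part of the example above):
   
   ```python
   from __future__ import annotations
   
   from airflow.decorators import task
   
   
   def load_table(table: str) -> bool:
       """Placeholder loader: pretend every table loads successfully."""
       return True
   
   
   @task
   def collect_successful_tables() -> list[str]:
       # Keep only the tables whose load succeeded.
       return [t for t in ["orders", "customers", "invoices"] if load_table(t)]
   ```
   
   Then replace the expand call with `use_metadata.expand(name=collect_successful_tables())`.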

