Aymen1617 opened a new issue, #57194:
URL: https://github.com/apache/airflow/issues/57194

   ### Apache Airflow version
   
   3.1.0
   
   ### If "Other Airflow 2/3 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   **My first DAG:**
   
```python
from airflow import DAG
from airflow.providers.standard.operators.python import PythonOperator
from airflow.providers.standard.operators.trigger_dagrun import TriggerDagRunOperator
from datetime import datetime

def fetch_job():
    job_id = 123
    target_id = 987
    conf = {"job_id": job_id, "target_id": target_id}
    print(f"[DAG A] Prepared conf to pass: {conf}")
    return conf  # pushed to XCom automatically

def trigger_dag_b_callable(**kwargs):
    ti = kwargs['ti']
    conf = ti.xcom_pull(task_ids='fetch_job')  # this is a dict
    print(f"[DAG A] Triggering DAG B with conf: {conf}")

    TriggerDagRunOperator(
        task_id="trigger_dag_b_inner",
        trigger_dag_id="dag_b",
        conf=conf,
        wait_for_completion=False
    ).execute(context=kwargs)

with DAG(
    dag_id="dag_a",
    start_date=datetime(2025, 1, 1),
    schedule=None,
    catchup=False
) as dag:

    fetch_job_task = PythonOperator(
        task_id="fetch_job",
        python_callable=fetch_job
    )

    trigger_dag_b_task = PythonOperator(
        task_id="trigger_dag_b",
        python_callable=trigger_dag_b_callable
    )

    fetch_job_task >> trigger_dag_b_task
```
   The logs for this DAG look fine; see them below:
   
   <img width="995" height="543" alt="Image" src="https://github.com/user-attachments/assets/74025374-84d0-4b5e-9bf0-8b3e5c1711f1" />
   and 
   
   <img width="993" height="517" alt="Image" src="https://github.com/user-attachments/assets/3a8cd188-aacc-4bb8-9c87-c05ffaf66b11" />
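   For reference, the conf hand-off I expect can be modeled with plain dicts (a toy sketch, not Airflow internals; `trigger` stands in for `TriggerDagRunOperator` and the wrapped dict stands in for the target `dag_run`):

```python
def fetch_job():
    # mirrors DAG A's fetch_job: the conf handed to the trigger
    return {"job_id": 123, "target_id": 987}

def trigger(conf):
    # toy stand-in for TriggerDagRunOperator: the conf given to the
    # trigger should become the target run's dag_run.conf
    return {"conf": conf}

def process_target(dag_run):
    # mirrors DAG B's process_target: read ids back out of the run's conf
    conf = dag_run.get("conf") or {}
    return conf.get("job_id"), conf.get("target_id")

print(process_target(trigger(fetch_job())))
```

   In this model DAG B gets `(123, 987)` back; in the actual run it gets `None` instead.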
   **My second DAG:**
```python
from airflow import DAG
from airflow.providers.standard.operators.python import PythonOperator
from datetime import datetime

def process_target(**kwargs):
    dag_run = kwargs.get('dag_run')
    conf = dag_run.conf if dag_run else {}
    print(f"[DAG B] Received conf: {conf}")

    job_id = conf.get("job_id")
    target_id = conf.get("target_id")
    print(f"[DAG B] Processing job_id={job_id}, target_id={target_id}")

with DAG(
    dag_id="dag_b",
    start_date=datetime(2025, 1, 1),
    schedule=None,
    catchup=False
) as dag:

    process_task = PythonOperator(
        task_id="process_target",
        python_callable=process_target
    )
```
   DAG B should receive the conf passed from DAG A, but `dag_run.conf` is showing **None**.
   See the logs:
   
   <img width="1040" height="391" alt="Image" src="https://github.com/user-attachments/assets/699bd65e-a320-4d6b-85a5-622c07c53e0d" />
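   While debugging, DAG B's callable can be hardened so a missing conf is reported rather than silently producing `None` values (`extract_ids` is a hypothetical helper I used for testing, not part of the original DAG):

```python
def extract_ids(conf):
    # conf may be None or {} when the triggering run did not pass one through
    if not conf:
        print("[DAG B] Warning: no conf received from the trigger")
        conf = {}
    return conf.get("job_id"), conf.get("target_id")
```

   This confirms the ids are missing because `dag_run.conf` itself is empty, not because of a lookup error.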
   
   
   ### What you think should happen instead?
   
   _No response_
   
   ### How to reproduce
   
   Save this file as a DAG and trigger it:
```python
from airflow import DAG
from airflow.providers.standard.operators.python import PythonOperator
from airflow.providers.standard.operators.trigger_dagrun import TriggerDagRunOperator
from datetime import datetime

def fetch_job():
    job_id = 123
    target_id = 987
    conf = {"job_id": job_id, "target_id": target_id}
    print(f"[DAG A] Prepared conf to pass: {conf}")
    return conf  # pushed to XCom automatically

def trigger_dag_b_callable(**kwargs):
    ti = kwargs['ti']
    conf = ti.xcom_pull(task_ids='fetch_job')  # this is a dict
    print(f"[DAG A] Triggering DAG B with conf: {conf}")

    TriggerDagRunOperator(
        task_id="trigger_dag_b_inner",
        trigger_dag_id="dag_b",
        conf=conf,
        wait_for_completion=False
    ).execute(context=kwargs)

with DAG(
    dag_id="dag_a",
    start_date=datetime(2025, 1, 1),
    schedule=None,
    catchup=False
) as dag:

    fetch_job_task = PythonOperator(
        task_id="fetch_job",
        python_callable=fetch_job
    )

    trigger_dag_b_task = PythonOperator(
        task_id="trigger_dag_b",
        python_callable=trigger_dag_b_callable
    )

    fetch_job_task >> trigger_dag_b_task
```
   
   Then save and run this DAG:
```python
from airflow import DAG
from airflow.providers.standard.operators.python import PythonOperator
from datetime import datetime

def process_target(**kwargs):
    dag_run = kwargs.get('dag_run')
    conf = dag_run.conf if dag_run else {}
    print(f"[DAG B] Received conf: {conf}")

    job_id = conf.get("job_id")
    target_id = conf.get("target_id")
    print(f"[DAG B] Processing job_id={job_id}, target_id={target_id}")

with DAG(
    dag_id="dag_b",
    start_date=datetime(2025, 1, 1),
    schedule=None,
    catchup=False
) as dag:

    process_task = PythonOperator(
        task_id="process_target",
        python_callable=process_target
    )
```
   
   ### Operating System
   
   linux
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   _No response_
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

