nik-davis commented on pull request #13893:
URL: https://github.com/apache/airflow/pull/13893#issuecomment-767403524


   Thanks @kaxil for the fix, we'll try that out today and let you know if 
there are any issues.
   
   I haven't been able to reproduce the error locally yet - I imagine with a 
local DB connection the DAG serialization is happening too fast to trigger the 
race condition. I'm wondering if adding a large file import to one of the 
operators could artificially slow it down a bit, but I don't know enough about 
how the serialization works to know for sure.
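   One hypothetical way to widen that window is an artificial delay at module level in the DAG file itself, since the file body re-runs on every parse. A minimal sketch (the delay constant and its value are assumptions to tune, not anything from the PR):

   ```python
   import time

   # Hypothetical parse-time delay: placed at the top of a DAG file, this
   # runs on every scheduler parse, which might widen the race window
   # between parsing and serialization.
   PARSE_DELAY_SECONDS = 0.05  # assumed value; tune as needed

   start = time.monotonic()
   time.sleep(PARSE_DELAY_SECONDS)
   elapsed = time.monotonic() - start
   ```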
   
   I've only managed to trigger the error locally by removing a row from the 
serialized_dag table. It looks like the try/excepts you've added would handle 
that case as well, though.
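   For reference, that row removal can be sketched like this; the snippet uses an in-memory SQLite table standing in for the metadata DB's serialized_dag table (the table and dag_id column follow Airflow's schema, but in a real deployment you'd run the DELETE against the actual Airflow database):

   ```python
   import sqlite3

   # Stand-in for the Airflow metadata DB; in practice, connect to the real
   # database backing the scheduler/webserver instead.
   conn = sqlite3.connect(":memory:")
   conn.execute("CREATE TABLE serialized_dag (dag_id TEXT PRIMARY KEY, data TEXT)")
   conn.execute("INSERT INTO serialized_dag VALUES (?, ?)", ("dynamic_dag_0", "{}"))

   # Removing the row makes the webserver's next lookup miss, exercising the
   # "serialized DAG not found" path.
   conn.execute("DELETE FROM serialized_dag WHERE dag_id = ?", ("dynamic_dag_0",))
   remaining = conn.execute("SELECT COUNT(*) FROM serialized_dag").fetchone()[0]
   ```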
   
   Anyway, here's the script I was toying with locally to try to trigger the 
error, in case it's of use (increase the value passed to `range` to generate 
more DAGs):
   
   ```python
   from airflow.decorators import dag, task
   from airflow.utils.dates import days_ago


   def create_dag(dag_id):
       @dag(
           default_args={"owner": "airflow", "start_date": days_ago(1)},
           schedule_interval="@once",
           dag_id=dag_id,
           catchup=False,
       )
       def dynamic_dag():
           @task()
           def dynamic_task_1(value=None, **kwargs):
               # Accept the upstream value positionally so the second call
               # below binds cleanly.
               return 1

           # Chain two instances so the second task depends on the first.
           task_1 = dynamic_task_1()
           task_2 = dynamic_task_1(task_1)

       return dynamic_dag()


   # Increase the range to generate more DAGs.
   for i in range(2):
       dag_id = f"dynamic_dag_{i}"
       globals()[dag_id] = create_dag(dag_id)
   ```
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

