HugoCornu opened a new issue, #25942:
URL: https://github.com/apache/airflow/issues/25942

   ### Apache Airflow version
   
   Other Airflow 2 version
   
   ### What happened
   
    Every time we create a new DAG, the first time we launch it, it runs twice.
   
   
   
   ### What you think should happen instead
   
    The DAG should run only once.
   
   ### How to reproduce
   
    Here is a DAG that exhibits the issue:

    ```python
   from datetime import datetime, timedelta
   from textwrap import dedent
   
   # The DAG object; we'll need this to instantiate a DAG
   from airflow import DAG
   
   # Operators; we need this to operate!
   from airflow.operators.bash import BashOperator
   
   with DAG(
       "tutorial",
       # These args will get passed on to each operator
    # You can override them on a per-task basis during operator initialization
       default_args={
           "depends_on_past": False,
           "email": ["[email protected]"],
           "email_on_failure": False,
           "email_on_retry": False,
           "retries": 1,
           "retry_delay": timedelta(minutes=5),
       },
       description="A simple tutorial DAG",
       schedule_interval=timedelta(days=1),
       start_date=datetime(2021, 1, 1),
       catchup=False,
       tags=["example"],
   ) as dag:
   
       # t1, t2 and t3 are examples of tasks created by instantiating operators
       t1 = BashOperator(
           task_id="print_date",
           bash_command="date",
       )
   
       t2 = BashOperator(
           task_id="sleep",
           depends_on_past=False,
           bash_command="sleep 5",
           retries=3,
       )
       t1.doc_md = dedent(
           """\
       #### Task Documentation
       You can document your task using the attributes `doc_md` (markdown),
       `doc` (plain text), `doc_rst`, `doc_json`, `doc_yaml` which gets
       rendered in the UI's Task Instance Details page.
       
        ![img](http://montcs.bloomu.edu/~bobmon/Semesters/2012-01/491/import%20soul.png)
   
       """
       )
   
       dag.doc_md = (
        __doc__  # providing that you have a docstring at the beginning of the DAG
       )
       dag.doc_md = """
       This is a documentation placed anywhere
       """  # otherwise, type it like this
       templated_command = dedent(
           """
       {% for i in range(5) %}
           echo "{{ ds }}"
           echo "{{ macros.ds_add(ds, 7)}}"
       {% endfor %}
       """
       )
   
       t3 = BashOperator(
           task_id="templated",
           depends_on_past=False,
           bash_command=templated_command,
       )
   
       t1 >> [t2, t3]
   ```
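
    For context on what I expected instead: my understanding (an assumption on my part, not verified against the scheduler internals) is that with `catchup=False`, unpausing a DAG creates exactly one run, for the most recent fully completed schedule interval. Here is a plain-datetime sketch of that expectation, using the values from the DAG above (the unpause moment `now` is made up):

    ```python
    from datetime import datetime, timedelta

    # Assumption: these mirror the DAG above -- schedule_interval=timedelta(days=1),
    # start_date=datetime(2021, 1, 1), catchup=False. "now" is a hypothetical
    # moment at which the DAG is first unpaused.
    interval = timedelta(days=1)
    start = datetime(2021, 1, 1)
    now = datetime(2022, 8, 25, 12, 0)

    # Number of schedule intervals that have fully elapsed since start_date.
    completed = (now - start) // interval

    # With catchup=False I'd expect a single run, whose logical date is the
    # start of the most recent completed interval -- not two runs.
    logical_date = start + (completed - 1) * interval

    print("expected single run, logical date:", logical_date)
    ```

    Instead of one run for that interval, we see two runs created back to back.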
   
   ### Operating System
   
    It's on AWS MWAA (Linux, I assume).
   
   ### Versions of Apache Airflow Providers
   
    | Package Name | Version | Description |
    | -- | -- | -- |
    | apache-airflow-providers-amazon | 2.4.0 | Amazon integration (including Amazon Web Services (AWS)) |
    | apache-airflow-providers-celery | 2.1.0 | Celery |
    | apache-airflow-providers-docker | 3.1.0 | Docker |
    | apache-airflow-providers-ftp | 2.0.1 | File Transfer Protocol (FTP) |
    | apache-airflow-providers-http | 2.0.1 | Hypertext Transfer Protocol (HTTP) |
    | apache-airflow-providers-imap | 2.0.1 | Internet Message Access Protocol (IMAP) |
    | apache-airflow-providers-postgres | 2.3.0 | PostgreSQL |
    | apache-airflow-providers-sqlite | 2.0.1 | SQLite |
   
   ### Deployment
   
   MWAA
   
   ### Deployment details
   
    MWAA with Airflow 2.2.2
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
    - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

