BoneArch opened a new issue #14572:
URL: https://github.com/apache/airflow/issues/14572


   
   **Apache Airflow version**: 2.0.1
   
   **Environment**:
   
   - **Docker Image**: apache/airflow:2.0.1-python3.8
   
   **What happened**:
   
   ```python
   from datetime import datetime

   from airflow import DAG
   from airflow.operators.python_operator import PythonOperator
   from airflow.operators.subdag_operator import SubDagOperator


   def print_date(**context):
       print(f'ds={context["ds"]}, next_ds={context["next_ds"]}, prev_ds={context["prev_ds"]}')


   def test_subdag(parent_dag_name, start_date, schedule_interval):
       # The subdag deliberately inherits the parent's schedule_interval and start_date.
       with DAG(f'{parent_dag_name}.child_dag', schedule_interval=schedule_interval,
                start_date=start_date) as sub_dag:
           PythonOperator(task_id='inner_test', python_callable=print_date,
                          provide_context=True)
           return sub_dag


   with DAG('test_dag', schedule_interval='0 21 * * *',
            start_date=datetime(2021, 2, 25)) as dag:
       PythonOperator(task_id='outer_test', python_callable=print_date,
                      provide_context=True)
       SubDagOperator(task_id='child_dag',
                      subdag=test_subdag(dag.dag_id, dag.start_date, dag.schedule_interval))
   ```
   Running the above code on Airflow 2.0.1, the subdag task `inner_test` receives `next_ds` and `prev_ds` equal to `ds`:
   ```
   outer_test => ds=2021-02-25, next_ds=2021-02-26, prev_ds=2021-02-24
   inner_test => ds=2021-02-25, next_ds=2021-02-25, prev_ds=2021-02-25
   ```
   
   **What you expected to happen**:
   
   I expected the behavior of Airflow 1.10.14, where `inner_test` receives the same context dates as the parent task:
   ```
   outer_test => ds=2021-02-25, next_ds=2021-02-26, prev_ds=2021-02-24
   inner_test => ds=2021-02-25, next_ds=2021-02-26, prev_ds=2021-02-24
   ```
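   For a daily schedule like `0 21 * * *`, the expected `next_ds` and `prev_ds` are exactly one day after and before `ds`. A minimal stdlib sketch of that expectation (the helper name is hypothetical and not part of Airflow; it assumes a fixed 24-hour interval rather than evaluating the cron expression):

   ```python
   from datetime import date, timedelta

   def expected_context_dates(ds: str, interval: timedelta = timedelta(days=1)):
       """Return the ds/next_ds/prev_ds a task on a daily schedule should see.

       Illustrative helper only -- not an Airflow API.
       """
       execution_date = date.fromisoformat(ds)
       return {
           "ds": ds,
           "next_ds": (execution_date + interval).isoformat(),
           "prev_ds": (execution_date - interval).isoformat(),
       }

   # For the run shown above: ds=2021-02-25 -> next_ds=2021-02-26, prev_ds=2021-02-24
   print(expected_context_dates("2021-02-25"))
   ```

   Both `outer_test` and `inner_test` should match this, since the subdag is built with the parent's `schedule_interval`.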
   
   
   **How to reproduce it**:
   
   Run the DAG using the `docker-compose.yml` and `airflow.env` files below.
   
   docker-compose.yml
   ```yaml
   version: "3"
   
   services:
     redis:
       image: 'redis:6-alpine'
       volumes:
         - redis_data:/data
   
     postgres:
       image: "postgres:13-alpine"
       environment:
         POSTGRES_USER: airflow
         POSTGRES_PASSWORD: airflow
         POSTGRES_DB: airflow
       volumes:
         - postgres_data:/var/lib/postgresql/data
   
     webserver:
       image: "apache/airflow:2.0.1-python3.8"
       depends_on:
         - postgres
       ports:
         - "8080:8080"
       env_file:
         - ./airflow.env
       command: webserver
   
     scheduler:
       image: "apache/airflow:2.0.1-python3.8"
       depends_on:
         - webserver
       env_file:
         - ./airflow.env
       volumes:
         - ./dags:/opt/airflow/dags
       command: scheduler
   
     worker:
       image: "apache/airflow:2.0.1-python3.8"
       hostname: 'airflow-worker'
       depends_on:
         - redis
         - scheduler
       env_file:
         - ./airflow.env
       volumes:
         - ./dags:/opt/airflow/dags
       command: celery worker
   
     flower:
       image: "apache/airflow:2.0.1-python3.8"
       hostname: 'airflow-flower'
       depends_on:
         - redis
         - worker
       env_file:
         - ./airflow.env
       command: celery flower
   
   volumes:
     postgres_data:
     redis_data:
   ```
   
   airflow.env
   ```
   AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@postgres/airflow
   AIRFLOW__CORE__EXECUTOR=CeleryExecutor
   AIRFLOW__CELERY__BROKER_URL=redis://redis:6379/1
   AIRFLOW__CELERY__RESULT_BACKEND=db+postgresql://airflow:airflow@postgres/airflow
   _AIRFLOW_DB_UPGRADE=True
   _AIRFLOW_WWW_USER_CREATE=True
   _AIRFLOW_WWW_USER_PASSWORD=password
   ```
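   With both files next to a `dags/` folder containing the DAG above, the stack can be brought up and the run triggered roughly like this (a sketch; service names follow the compose file, and `airflow dags unpause`/`airflow dags trigger` are the Airflow 2.0 CLI commands):

   ```shell
   # Start the stack defined in docker-compose.yml
   docker-compose up -d

   # Unpause and trigger the DAG from inside the webserver container
   docker-compose exec webserver airflow dags unpause test_dag
   docker-compose exec webserver airflow dags trigger test_dag

   # Follow the worker logs to see the printed context dates
   docker-compose logs -f worker
   ```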
   
   **Anything else we need to know**:
   
   
   The problem occurs every time.
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

