jeongukjae opened a new issue, #24594:
URL: https://github.com/apache/airflow/issues/24594

   ### Apache Airflow Provider(s)
   
   docker
   
   ### Versions of Apache Airflow Providers
   
   ```
   apache-airflow==2.3.2 --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.3.2/constraints-3.8.txt"
   apache-airflow-providers-docker==3.0.*
   ```
   
   ### Apache Airflow version
   
   2.3.2 (latest released)
   
   ### Operating System
   
   macOS
   
   ### Deployment
   
   Virtualenv installation
   
   ### Deployment details
   
   `airflow standalone` command (dev environment).
   
   ### What happened
   
   While developing some DAGs in a standalone environment, I noticed that context environment variables such as `AIRFLOW_CTX_EXECUTION_DATE` are not available inside `DockerOperator` containers.
   
   ### What you think should happen instead
   
   The context env vars (`AIRFLOW_CTX_*`) should be available inside the container.
   
   ### How to reproduce
   
   This seems to affect every `DockerOperator` task; none of them receives the context env vars. A minimal DAG like the following reproduces it:
   
   ```python
   from airflow import DAG
   from airflow.providers.docker.operators.docker import DockerOperator

   with DAG(
     pipeline_name,
     description=pipeline_description,
     ...
   ) as dag:
     DockerOperator(
       task_id="example_task_name",
       image="example_image",
       ...
     )
   ```
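
   For a fully self-contained check (the DAG id, image, and command below are just illustrative; any image with a shell works), a sketch like this makes the problem visible by printing the container environment:

   ```python
   from datetime import datetime

   from airflow import DAG
   from airflow.providers.docker.operators.docker import DockerOperator

   # Hypothetical minimal DAG: its only purpose is to show which env vars
   # actually reach the container.
   with DAG(
       "docker_ctx_env_repro",
       start_date=datetime(2022, 6, 1),
       schedule_interval=None,
       catchup=False,
   ) as dag:
       DockerOperator(
           task_id="print_env",
           image="bash",
           command='bash -c "env | sort"',  # AIRFLOW_CTX_* vars are missing from this output
           auto_remove=True,
       )
   ```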
   
   ### Anything else
   
   logs:
   
   ```
   ...
   logs from taskinstance.py ...
   ...
   [2022-06-22, 15:29:21 UTC] {task_command.py:370} INFO - Running <TaskInstance: ... scheduled__2022-06-22T06:25:02.977109+00:00 map_index=2 [running]> on host 1.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.ip6.arpa
   [2022-06-22, 15:29:21 UTC] {taskinstance.py:1569} INFO - Exporting the following env vars:
   AIRFLOW_CTX_DAG_OWNER=...
   AIRFLOW_CTX_DAG_ID=...
   AIRFLOW_CTX_TASK_ID=...
   AIRFLOW_CTX_EXECUTION_DATE=2022-06-22T06:25:02.977109+00:00
   AIRFLOW_CTX_TRY_NUMBER=1
   AIRFLOW_CTX_DAG_RUN_ID=scheduled__2022-06-22T06:25:02.977109+00:00
   [2022-06-22, 15:29:22 UTC] {docker.py:248} INFO - Starting docker container from image ...
   ...
   Docker container logs ...
   ...
   ```
   
   The log says the following env vars are exported, but inside the container I cannot see any env var except `AIRFLOW_TMP_DIR`.
   
   Here's the relevant code.
   
   In `DockerOperator.execute`, there is no statement that reads environment variables from the `context` argument (https://github.com/apache/airflow/blob/providers-docker/3.0.0/airflow/providers/docker/operators/docker.py#L365-L389). By contrast, `BashOperator.execute` builds its environment from the `context` argument (https://github.com/2.3.2/airflow/operators/bash.py#L184).
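
   As a workaround (and roughly what a fix could look like), a subclass can merge the context vars into `self.environment` before the container starts, mirroring what `BashOperator` does via `context_to_airflow_vars`. The subclass name below is just illustrative:

   ```python
   from airflow.providers.docker.operators.docker import DockerOperator
   from airflow.utils.operator_helpers import context_to_airflow_vars


   class ContextEnvDockerOperator(DockerOperator):
       """DockerOperator that forwards AIRFLOW_CTX_* variables into the container."""

       def execute(self, context):
           # Build the AIRFLOW_CTX_* variables from the task context using the same
           # helper BashOperator uses, then merge them into the container environment.
           airflow_context_vars = context_to_airflow_vars(context, in_env_var_format=True)
           self.environment = {**airflow_context_vars, **(self.environment or {})}
           return super().execute(context)
   ```

   Keeping the user-supplied `environment` last in the merge lets explicitly passed values win over the generated `AIRFLOW_CTX_*` ones.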
   
   It would be great if the context environment variables were passed into the containers :)
   
   Thanks!
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

