Jake-Gillberg opened a new issue #14838:
URL: https://github.com/apache/airflow/issues/14838


   **Apache Airflow version / Environment**:
   apache/airflow:2.0.1-python3.8
   
   **What happened**:
   
   This is a dev testing password, so I don't mind that it is leaked in the 
error message here:
   
   ```
   ERROR - Failed to execute task Cannot execute bash -cl 'echo postgresql+psycopg2://airflow:hAY&Bv+w*,q24+JN@airflow_db/airflow'. Error code is: 127. Output: postgresql+psycopg2://airflow:hAY , Stderr: bash: Bv+w*,q24+JN@airflow_db/airflow: No such file or directory.
   ```

   **What you expected to happen**:
   
   Expected the tutorial task to complete successfully.
   
   **What do you think went wrong?**:
   
   I am using a docker `.env` file to hold the SQL connection information, and 
have specified `AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD` as `bash -cl 'echo 
postgresql+psycopg2://$${POSTGRES_USER}:$${POSTGRES_PASSWORD}@airflow_db/$${POSTGRES_DB}'`
 in `docker-compose.yml` (the double `$$` escapes docker-compose's variable 
substitution). This seems to work fine on initialization (the webserver, 
scheduler, and init containers all run successfully) up until the local 
executor tries to log the completion of the task.
   
   ```
   airflow@4bdf5cf69c11:/opt/airflow$ airflow dags backfill tutorial 
--start-date 2015-06-01 --end-date 2015-06-07
   
/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/commands/dag_command.py:60
 PendingDeprecationWarning: --ignore-first-depends-on-past is deprecated as the 
value is always set to True
   [2021-03-16 16:37:36,822] {dagbag.py:448} INFO - Filling up the DagBag from 
/opt/airflow/dags
   [2021-03-16 16:37:36,945] {backfill_job.py:902} INFO - Reset the following 2 
TaskInstances:
           <TaskInstance: tutorial.sleep 2015-06-01 00:00:00+00:00 [None]>
           <TaskInstance: tutorial.templated 2015-06-01 00:00:00+00:00 [None]>
   [2021-03-16 16:37:36,968] {base_executor.py:82} INFO - Adding to queue: 
['airflow', 'tasks', 'run', 'tutorial', 'print_date', 
'2015-06-01T00:00:00+00:00', '--ignore-depends-on-past', '--local', '--pool', 
'default_pool', '--subdir', 'DAGS_FOLDER/example.py', '--cfg-path', 
'/tmp/tmp5ywf__ot']
   [2021-03-16 16:37:41,847] {local_executor.py:81} INFO - QueuedLocalWorker 
running ['airflow', 'tasks', 'run', 'tutorial', 'print_date', 
'2015-06-01T00:00:00+00:00', '--ignore-depends-on-past', '--local', '--pool', 
'default_pool', '--subdir', 'DAGS_FOLDER/example.py', '--cfg-path', 
'/tmp/tmp5ywf__ot']
   [2021-03-16 16:37:41,855] {backfill_job.py:377} INFO - [backfill progress] | 
finished run 0 of 1 | tasks waiting: 2 | succeeded: 0 | running: 1 | failed: 0 
| skipped: 0 | deadlocked: 0 | not ready: 2
   [2021-03-16 16:37:41,877] {dagbag.py:448} INFO - Filling up the DagBag from 
/opt/airflow/dags/example.py
   Running <TaskInstance: tutorial.print_date 2015-06-01T00:00:00+00:00 
[queued]> on host 4bdf5cf69c11
   [2021-03-16 16:37:41,974] {local_executor.py:127} ERROR - Failed to execute 
task Cannot execute bash -cl 'echo 
postgresql+psycopg2://airflow:hAY&Bv+w*,q24+JN@airflow_db/airflow'. Error code 
is: 127. Output: postgresql+psycopg2://airflow:hAY
   , Stderr: bash: Bv+w*,q24+JN@airflow_db/airflow: No such file or directory
   .
   [2021-03-16 16:37:46,873] {backfill_job.py:282} ERROR - Executor reports 
task instance <TaskInstance: tutorial.print_date 2015-06-01 00:00:00+00:00 
[queued]> finished (failed) although the task says its queued. Was the task 
killed externally? Info: None
   [2021-03-16 16:37:46,874] {taskinstance.py:1455} ERROR - Executor reports 
task instance <TaskInstance: tutorial.print_date 2015-06-01 00:00:00+00:00 
[queued]> finished (failed) although the task says its queued. Was the task 
killed externally? Info: None
   NoneType: None
   [2021-03-16 16:37:46,874] {taskinstance.py:1496} INFO - Marking task as 
UP_FOR_RETRY. dag_id=tutorial, task_id=print_date, 
execution_date=20150601T000000, start_date=20210316T160557, 
end_date=20210316T163746
   [2021-03-16 16:37:46,885] {backfill_job.py:225} WARNING - Task instance 
<TaskInstance: tutorial.print_date 2015-06-01 00:00:00+00:00 [up_for_retry]> is 
up for retry
   [2021-03-16 16:37:46,890] {backfill_job.py:377} INFO - [backfill progress] | 
finished run 0 of 1 | tasks waiting: 3 | succeeded: 0 | running: 0 | failed: 0 
| skipped: 0 | deadlocked: 0 | not ready: 2
   ```
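   A possible workaround (my assumption, not verified against Airflow's config handling): double-quoting the URL inside the `echo` keeps shell metacharacters in the expanded password literal even if the `-c` string is re-parsed after substitution:
   
   ```shell
   # With the URL double-quoted, the '&' in the password is no longer a
   # control operator and the full connection string is printed intact.
   # In docker-compose.yml this would be written with the $$-escaped form:
   #   bash -cl 'echo "postgresql+psycopg2://$${POSTGRES_USER}:$${POSTGRES_PASSWORD}@airflow_db/$${POSTGRES_DB}"'
   bash -c 'echo "postgresql+psycopg2://airflow:hAY&Bv+w*,q24+JN@airflow_db/airflow"'
   # prints: postgresql+psycopg2://airflow:hAY&Bv+w*,q24+JN@airflow_db/airflow
   ```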


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

