Chris Bandy created AIRFLOW-3312:
------------------------------------

             Summary: No log output from BashOperator under test
                 Key: AIRFLOW-3312
                 URL: https://issues.apache.org/jira/browse/AIRFLOW-3312
             Project: Apache Airflow
          Issue Type: Bug
          Components: logging, operators
    Affects Versions: 1.10.0
            Reporter: Chris Bandy


The BashOperator logs several messages, as well as the stdout of its command, at the INFO level, but none of these appear when running {{airflow test}} with the default configuration.

For example, in Airflow 1.10.0 this DAG emits only the following:
{code:python}
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime

dag = DAG('please', start_date=datetime(year=2018, month=11, day=1))

BashOperator(dag=dag, task_id='mine', bash_command='echo thank you')
{code}

{noformat}
$ airflow test please mine '2018-11-01'
[2018-11-08 00:06:54,098] {__init__.py:51} INFO - Using executor SequentialExecutor
[2018-11-08 00:06:54,246] {models.py:258} INFO - Filling up the DagBag from /usr/local/airflow/dags
{noformat}
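
A possible explanation (not verified): under the default 1.10.0 logging config, operator messages are routed to the {{airflow.task}} logger, which has {{propagate}} disabled and only the file-based {{task}} handler, so nothing reaches the console handler on the root logger during {{airflow test}}. A quick diagnostic sketch to inspect that wiring:
{code:python}
import logging

import airflow  # importing airflow applies the active logging config

# Diagnostic sketch: show which handlers receive task-level messages and
# whether they propagate up to the root (console) logger. Logger names
# assume the default Airflow logging config.
for name in ('airflow', 'airflow.task'):
    logger = logging.getLogger(name)
    print(name, '-> handlers:', logger.handlers, 'propagate:', logger.propagate)
{code}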

When the same task is executed by the scheduler, the operator's log messages do go to a file:

{noformat}
$ airflow scheduler -n 1
...
[2018-11-08 00:41:02,674] {dag_processing.py:582} INFO - Started a process (PID: 9) to generate tasks for /usr/local/airflow/dags/please.py
[2018-11-08 00:41:03,185] {dag_processing.py:495} INFO - Processor for /usr/local/airflow/dags/please.py finished
[2018-11-08 00:41:03,525] {jobs.py:1114} INFO - Tasks up for execution:
        <TaskInstance: please.mine 2018-11-01 00:00:00+00:00 [scheduled]>
[2018-11-08 00:41:03,536] {jobs.py:1147} INFO - Figuring out tasks to run in Pool(name=None) with 128 open slots and 1 task instances in queue
[2018-11-08 00:41:03,539] {jobs.py:1184} INFO - DAG please has 0/16 running and queued tasks
[2018-11-08 00:41:03,540] {jobs.py:1216} INFO - Setting the follow tasks to queued state:
        <TaskInstance: please.mine 2018-11-01 00:00:00+00:00 [scheduled]>
[2018-11-08 00:41:03,573] {jobs.py:1297} INFO - Setting the follow tasks to queued state:
        <TaskInstance: please.mine 2018-11-01 00:00:00+00:00 [queued]>
[2018-11-08 00:41:03,576] {jobs.py:1339} INFO - Sending ('please', 'mine', datetime.datetime(2018, 11, 1, 0, 0, tzinfo=<Timezone [UTC]>)) to executor with priority 1 and queue default
[2018-11-08 00:41:03,578] {base_executor.py:56} INFO - Adding to queue: airflow run please mine 2018-11-01T00:00:00+00:00 --local -sd /usr/local/airflow/dags/please.py
[2018-11-08 00:41:03,593] {sequential_executor.py:45} INFO - Executing command: airflow run please mine 2018-11-01T00:00:00+00:00 --local -sd /usr/local/airflow/dags/please.py
[2018-11-08 00:41:04,262] {__init__.py:51} INFO - Using executor SequentialExecutor
[2018-11-08 00:41:04,406] {models.py:258} INFO - Filling up the DagBag from /usr/local/airflow/dags/please.py
[2018-11-08 00:41:04,458] {cli.py:492} INFO - Running <TaskInstance: please.mine 2018-11-01T00:00:00+00:00 [queued]> on host e2e08cf4dfaa
[2018-11-08 00:41:09,684] {jobs.py:1443} INFO - Executor reports please.mine execution_date=2018-11-01 00:00:00+00:00 as success

$ cat logs/please/mine/2018-11-01T00\:00\:00+00\:00/1.log
[2018-11-08 00:41:04,554] {models.py:1335} INFO - Dependencies all met for <TaskInstance: please.mine 2018-11-01T00:00:00+00:00 [queued]>
[2018-11-08 00:41:04,564] {models.py:1335} INFO - Dependencies all met for <TaskInstance: please.mine 2018-11-01T00:00:00+00:00 [queued]>
[2018-11-08 00:41:04,565] {models.py:1547} INFO -
--------------------------------------------------------------------------------
Starting attempt 1 of 1
--------------------------------------------------------------------------------

[2018-11-08 00:41:04,605] {models.py:1569} INFO - Executing <Task(BashOperator): mine> on 2018-11-01T00:00:00+00:00
[2018-11-08 00:41:04,605] {base_task_runner.py:124} INFO - Running: ['bash', '-c', 'airflow run please mine 2018-11-01T00:00:00+00:00 --job_id 142 --raw -sd DAGS_FOLDER/please.py --cfg_path /tmp/tmp9prq7knr']
[2018-11-08 00:41:05,214] {base_task_runner.py:107} INFO - Job 142: Subtask mine [2018-11-08 00:41:05,213] {__init__.py:51} INFO - Using executor SequentialExecutor
[2018-11-08 00:41:05,334] {base_task_runner.py:107} INFO - Job 142: Subtask mine [2018-11-08 00:41:05,333] {models.py:258} INFO - Filling up the DagBag from /usr/local/airflow/dags/please.py
[2018-11-08 00:41:05,368] {base_task_runner.py:107} INFO - Job 142: Subtask mine [2018-11-08 00:41:05,367] {cli.py:492} INFO - Running <TaskInstance: please.mine 2018-11-01T00:00:00+00:00 [running]> on host e2e08cf4dfaa
[2018-11-08 00:41:05,398] {bash_operator.py:74} INFO - Tmp dir root location:
 /tmp
[2018-11-08 00:41:05,398] {bash_operator.py:87} INFO - Temporary script location: /tmp/airflowtmp0is6wwxi/mine8tmew5y4
[2018-11-08 00:41:05,399] {bash_operator.py:97} INFO - Running command: echo thank you
[2018-11-08 00:41:05,402] {bash_operator.py:106} INFO - Output:
[2018-11-08 00:41:05,404] {bash_operator.py:110} INFO - thank you
[2018-11-08 00:41:05,404] {bash_operator.py:114} INFO - Command exited with return code 0
[2018-11-08 00:41:09,504] {logging_mixin.py:95} INFO - [2018-11-08 00:41:09,503] {jobs.py:2612} INFO - Task exited with return code 0
{noformat}
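
The path of that per-task file appears to follow the default {{log_filename_template}}; roughly (illustrative sketch only, not the handler's actual code):
{code:python}
from datetime import datetime, timezone

# Illustrative only: the default [core] log_filename_template is
# "{{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log",
# rendered under base_log_folder.
dag_id, task_id, try_number = 'please', 'mine', 1
ts = datetime(2018, 11, 1, tzinfo=timezone.utc).isoformat()
print('logs/{0}/{1}/{2}/{3}.log'.format(dag_id, task_id, ts, try_number))
# -> logs/please/mine/2018-11-01T00:00:00+00:00/1.log
{code}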

----

This appears to be a regression. In Airflow 1.9.0, the same DAG with the default configuration emits:
{noformat}
$ airflow test please mine '2018-11-01'
[2018-11-08 00:05:51,957] {driver.py:123} INFO - Generating grammar tables from /usr/lib/python3.5/lib2to3/Grammar.txt
[2018-11-08 00:05:51,995] {driver.py:123} INFO - Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
[2018-11-08 00:05:52,174] {__init__.py:45} INFO - Using executor SequentialExecutor
[2018-11-08 00:05:52,256] {models.py:189} INFO - Filling up the DagBag from /usr/local/airflow/dags
[2018-11-08 00:05:57,172] {bash_operator.py:70} INFO - Tmp dir root location:
 /tmp
[2018-11-08 00:05:57,173] {bash_operator.py:80} INFO - Temporary script location: /tmp/airflowtmpq0fefk_2//tmp/airflowtmpq0fefk_2/minegr_ipkhi
[2018-11-08 00:05:57,173] {bash_operator.py:88} INFO - Running command: echo thank you
[2018-11-08 00:05:57,180] {bash_operator.py:97} INFO - Output:
[2018-11-08 00:05:57,182] {bash_operator.py:101} INFO - thank you
[2018-11-08 00:05:57,182] {bash_operator.py:105} INFO - Command exited with return code 0
{noformat}
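
A possible workaround (not verified) is to attach a console handler to the {{airflow.task}} logger, for example from a custom logging config referenced by {{logging_config_class}} in airflow.cfg:
{code:python}
import logging
import sys

# Sketch of a possible workaround, assuming the default 1.10.0 config where
# "airflow.task" has propagate=False and only the file-based "task" handler:
# add a plain console handler so operator output shows up under `airflow test`.
console = logging.StreamHandler(sys.stdout)
console.setFormatter(logging.Formatter(
    '[%(asctime)s] {%(filename)s:%(lineno)d} %(levelname)s - %(message)s'))
logging.getLogger('airflow.task').addHandler(console)
{code}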


