Hi,
I have a PythonOperator in a DAG which calls a function that prints
"Hello World!" to stdout.
When I run the DAG with airflow test, I see the print:
@airflow]$ airflow test test_daily print_date_python 2017-02-01
[2017-03-03 11:09:16,659] {__init__.py:36} INFO - Using executor CeleryExecutor
[2017-03-03 11:09:23,854] {models.py:154} INFO - Filling up the DagBag from /home/airflow/dags/dev
[2017-03-03 11:09:23,934] {models.py:1196} INFO -
--------------------------------------------------------------------------------
Starting attempt 1 of 2
--------------------------------------------------------------------------------
[2017-03-03 11:09:23,935] {models.py:1219} INFO - Executing <Task(PythonOperator): print_date_python> on 2017-02-01 00:00:00
Hello World!
[2017-03-03 11:09:23,941] {python_operator.py:67} INFO - Done. Returned value was: None
However, when I schedule the job with the CeleryExecutor, I do not see
the print in the task log. This is what I see instead:
[2017-03-03 15:54:11,453] {models.py:154} INFO - Filling up the DagBag from /netusers/testop/airflow/dags/dev/trth_tas_daily.py
[2017-03-03 15:54:23,447] {models.py:154} INFO - Filling up the DagBag from /netusers/testop/airflow/dags/dev/trth_tas_daily.py
[2017-03-03 15:54:23,490] {models.py:1196} INFO -
--------------------------------------------------------------------------------
Starting attempt 1 of 2
--------------------------------------------------------------------------------
[2017-03-03 15:54:23,520] {models.py:1219} INFO - Executing <Task(PythonOperator): print_date_python> on 2017-03-03 15:53:00
[2017-03-03 15:54:23,527] {python_operator.py:67} INFO - Done. Returned value was: None
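Should the callable be routing the message through the logging module instead of print, so it is picked up by the task log handler? Something like this (untested sketch):

import logging

def print_hello():
    # Log instead of print, in case the Celery worker does not forward
    # the callable's stdout to the task log.
    logging.info("Hello World!")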
Any ideas?
Thanks,
-A