[
https://issues.apache.org/jira/browse/AIRFLOW-3064?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16649417#comment-16649417
]
ASF GitHub Bot commented on AIRFLOW-3064:
-----------------------------------------
ashb opened a new pull request #4051: [AIRFLOW-3064] Show logs/output from
operators in `airflow test` command
URL: https://github.com/apache/incubator-airflow/pull/4051
Make sure you have checked _all_ steps below.
### Jira
- [x] https://issues.apache.org/jira/browse/AIRFLOW-3064
### Description
- [x] The logging rejig we did for 1.10 ended up with the output from
operators/tasks going nowhere when running `airflow test`: since we never
call `ti.init_run_context()` on that code path, the FileTaskHandler doesn't
have a filename, so the logs are dropped.
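The general idea of the fix can be sketched with the stdlib `logging` module alone: attach a stdout handler to the task logger so its records become visible in the terminal. This is an illustrative sketch, not the actual PR code; the function name and format string are assumptions.

```python
import logging
import sys


def redirect_task_logs_to_console(logger_name="airflow.task"):
    """Attach a stdout StreamHandler so task log records reach the terminal.

    Illustrative sketch only: the real PR wires this up inside the
    `airflow test` command; the name of this helper is made up.
    """
    logger = logging.getLogger(logger_name)
    handler = logging.StreamHandler(sys.stdout)
    # Mimic Airflow's default log line layout.
    handler.setFormatter(logging.Formatter(
        "[%(asctime)s] {%(filename)s:%(lineno)d} %(levelname)s - %(message)s"))
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    # Don't double-print via the root logger.
    logger.propagate = False
    return logger


log = redirect_task_logs_to_console()
log.info("task output now reaches stdout")
```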
### Tests
- [x] My PR adds the following unit tests: tests in test_cli that validate
that the output is actually printed.
### Commits
- [x] My commits all reference Jira issues in their subject lines, and I
have squashed multiple commits if they address the same issue. In addition, my
commits follow the guidelines from "[How to write a good git commit
message](http://chris.beams.io/posts/git-commit/)":
1. Subject is separated from body by a blank line
1. Subject is limited to 50 characters (not including Jira issue reference)
1. Subject does not end with a period
1. Subject uses the imperative mood ("add", not "adding")
1. Body wraps at 72 characters
1. Body explains "what" and "why", not "how"
### Documentation
- [x] In case of new functionality, my PR adds documentation that describes
how to use it.
- When adding new operators/hooks/sensors, the autoclass documentation
generation needs to be added.
### Code Quality
- [x] Passes `flake8`
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
> Tutorial says to expect output, but no output due to default logging config
> ---------------------------------------------------------------------------
>
> Key: AIRFLOW-3064
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3064
> Project: Apache Airflow
> Issue Type: Bug
> Components: Documentation, logging
> Affects Versions: 1.10.0
> Environment: CentOS release 6.5, Python 2.7.5
> Reporter: Brian King
> Assignee: Ash Berlin-Taylor
> Priority: Minor
>
> On [https://airflow.incubator.apache.org/tutorial.html#id1] , it says about
> running the test commands:
> {quote}This should result in displaying a verbose log of events and
> ultimately running your bash command and printing the result.
> Note that the {{airflow test}} command runs task instances locally, outputs
> their log to stdout (on screen), ...
> {quote}
> In fact, there is some logging output, but no output of the tasks:
> {code:java}
> $ airflow test tutorial print_date 2015-06-01
> [2018-09-14 14:43:58,380] {__init__.py:51} INFO - Using executor SequentialExecutor
> [2018-09-14 14:43:58,493] {models.py:258} INFO - Filling up the DagBag from /vagrant/airflow/dags
> [2018-09-14 14:43:58,571] {example_kubernetes_operator.py:54} WARNING - Could not import KubernetesPodOperator: No module named kubernetes
> [2018-09-14 14:43:58,572] {example_kubernetes_operator.py:55} WARNING - Install kubernetes dependencies with: pip install airflow['kubernetes']{code}
>
> I looked at the logging config, and thought that perhaps the task output
> would be logged to a file (since the default logging config's task handler
> logs to files), but I didn't find anything (relevant) in the log directory.
> To see the task output, I had to use a custom logging config, based on the
> DEFAULT_LOGGING_CONFIG, that used the console handler instead of the task
> handler for the 'airflow.task' logger:
> {code:java}
> 'loggers': {
>     'airflow.task': {
>         # 'handlers': ['task'],
>         'handlers': ['console'],
>         'level': 'INFO',
>         'propagate': False,
>     },{code}
> This results in the task output showing up:
> {code:java}
> $ airflow test tutorial print_date 2015-06-01
> [2018-09-14 14:49:16,897] {__init__.py:51} INFO - Using executor SequentialExecutor
> [2018-09-14 14:49:17,017] {models.py:258} INFO - Filling up the DagBag from /vagrant/airflow/dags
> [2018-09-14 14:49:17,093] {example_kubernetes_operator.py:54} WARNING - Could not import KubernetesPodOperator: No module named kubernetes
> [2018-09-14 14:49:17,093] {example_kubernetes_operator.py:55} WARNING - Install kubernetes dependencies with: pip install airflow['kubernetes']
> [2018-09-14 14:49:17,176] {models.py:1335} INFO - Dependencies all met for <TaskInstance: tutorial.print_date 2015-06-01T00:00:00+00:00 [None]>
> [2018-09-14 14:49:17,179] {models.py:1335} INFO - Dependencies all met for <TaskInstance: tutorial.print_date 2015-06-01T00:00:00+00:00 [None]>
> [2018-09-14 14:49:17,179] {models.py:1547} INFO -
> --------------------------------------------------------------------------------
> Starting attempt 1 of 2
> --------------------------------------------------------------------------------
> [2018-09-14 14:49:17,180] {models.py:1569} INFO - Executing <Task(BashOperator): print_date> on 2015-06-01T00:00:00+00:00
> [2018-09-14 14:49:17,236] {bash_operator.py:74} INFO - Tmp dir root location: /tmp
> [2018-09-14 14:49:17,237] {bash_operator.py:87} INFO - Temporary script location: /tmp/airflowtmp6ieJDv/print_dateZV3cw8
> [2018-09-14 14:49:17,237] {bash_operator.py:97} INFO - Running command: date
> [2018-09-14 14:49:17,241] {bash_operator.py:106} INFO - Output:
> [2018-09-14 14:49:17,250] {bash_operator.py:110} INFO - Fri Sep 14 14:49:17 UTC 2018
> [2018-09-14 14:49:17,252] {bash_operator.py:114} INFO - Command exited with return code 0{code}
>
> That change to the logging config is probably not the appropriate change to
> make for real life usage, but for someone working through the tutorial, it's
> nice to see the output.
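The reporter's workaround can be reproduced outside Airflow with the standard library alone. The sketch below applies an equivalent dict config directly via `logging.config.dictConfig`; in an actual Airflow deployment you would instead copy `DEFAULT_LOGGING_CONFIG` into your own module and point `logging_config_class` in airflow.cfg at it (the formatter string here is an approximation of Airflow's default, not copied from its source).

```python
import logging
import logging.config

# Minimal stand-in for the reporter's customised DEFAULT_LOGGING_CONFIG:
# the 'airflow.task' logger writes to the console instead of per-task files.
LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "airflow": {
            "format": "[%(asctime)s] {%(filename)s:%(lineno)d} "
                      "%(levelname)s - %(message)s",
        },
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "formatter": "airflow",
            "stream": "ext://sys.stdout",
        },
    },
    "loggers": {
        "airflow.task": {
            # "handlers": ["task"],   # default: writes to per-task log files
            "handlers": ["console"],  # workaround: print to the terminal
            "level": "INFO",
            "propagate": False,
        },
    },
}

logging.config.dictConfig(LOGGING_CONFIG)
logging.getLogger("airflow.task").info("task output is visible on stdout")
```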
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)