ashb commented on a change in pull request #9363:
URL: https://github.com/apache/airflow/pull/9363#discussion_r445423264
##########
File path: airflow/task/task_runner/standard_task_runner.py
##########
@@ -73,11 +74,24 @@ def _start_by_fork(self): # pylint: disable=inconsistent-return-statements
# [1:] - remove "airflow" from the start of the command
args = parser.parse_args(self._command[1:])
+ self.log.info('Running: %s', self._command)
+ self.log.info('Job %s: Subtask %s', self._task_instance.job_id, self._task_instance.task_id)
+
proc_title = "airflow task runner: {0.dag_id} {0.task_id} {0.execution_date}"
if hasattr(args, "job_id"):
proc_title += " {0.job_id}"
setproctitle(proc_title.format(args))
+ # Get all the Handlers from 'airflow.task' logger
+ # Add these handlers to the root logger so that we can get logs from
+ # any custom loggers defined in the DAG
+ airflow_logger_handlers = logging.getLogger('airflow.task').handlers
+ root_logger = logging.getLogger()
+ for handler in airflow_logger_handlers:
+ if isinstance(handler, FileTaskHandler):
Review comment:
I think actually we should just copy any and all handlers assigned to `airflow.task` (i.e. remove the `if` altogether) -- the purpose here is to make all logs produced when running a task go to the task file.
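
A minimal sketch of the suggested change, with the `isinstance(handler, FileTaskHandler)` filter removed so that *every* handler on the `airflow.task` logger is attached to the root logger (the helper name and the duplicate-handler guard are my own additions for illustration, not from the PR):

```python
import logging


def copy_task_handlers_to_root():
    # Get all the handlers from the 'airflow.task' logger and add them
    # to the root logger, so that logs emitted by any custom loggers
    # defined in a DAG also end up in the task log file. No isinstance
    # filter: copy any and all handlers, per the review suggestion.
    root_logger = logging.getLogger()
    for handler in logging.getLogger('airflow.task').handlers:
        # Guard against attaching the same handler twice (illustrative).
        if handler not in root_logger.handlers:
            root_logger.addHandler(handler)
```

With this in place, a `logging.getLogger("my_custom_logger")` call inside a DAG propagates up to the root logger and is written by the same handlers that produce the task log file.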
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]