[
https://issues.apache.org/jira/browse/AIRFLOW-4719?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16963863#comment-16963863
]
Alan edited comment on AIRFLOW-4719 at 10/31/19 10:59 AM:
----------------------------------------------------------
I'm currently using version 1.10.5 and hit a similar issue:
{noformat}
Oct 31 05:56:35 build-node airflow: Traceback (most recent call last):
Oct 31 05:56:35 build-node airflow: File "/usr/lib64/python3.6/multiprocessing/process.py", line 258, in _bootstrap
Oct 31 05:56:35 build-node airflow: self.run()
Oct 31 05:56:35 build-node airflow: File "/usr/lib64/python3.6/multiprocessing/process.py", line 93, in run
Oct 31 05:56:35 build-node airflow: self._target(*self._args, **self._kwargs)
Oct 31 05:56:35 build-node airflow: File "/usr/local/lib/python3.6/site-packages/airflow/jobs/scheduler_job.py", line 128, in _run_file_processor
Oct 31 05:56:35 build-node airflow: set_context(log, file_path)
Oct 31 05:56:35 build-node airflow: File "/usr/local/lib/python3.6/site-packages/airflow/utils/log/logging_mixin.py", line 170, in set_context
Oct 31 05:56:35 build-node airflow: handler.set_context(value)
Oct 31 05:56:35 build-node airflow: File "/usr/local/lib/python3.6/site-packages/airflow/utils/log/file_processor_handler.py", line 65, in set_context
Oct 31 05:56:35 build-node airflow: local_loc = self._init_file(filename)
Oct 31 05:56:35 build-node airflow: File "/usr/local/lib/python3.6/site-packages/airflow/utils/log/file_processor_handler.py", line 141, in _init_file
Oct 31 05:56:35 build-node airflow: os.makedirs(directory)
Oct 31 05:56:35 build-node airflow: File "/usr/lib64/python3.6/os.py", line 210, in makedirs
Oct 31 05:56:35 build-node airflow: makedirs(head, mode, exist_ok)
Oct 31 05:56:35 build-node airflow: File "/usr/lib64/python3.6/os.py", line 210, in makedirs
Oct 31 05:56:35 build-node airflow: makedirs(head, mode, exist_ok)
Oct 31 05:56:35 build-node airflow: File "/usr/lib64/python3.6/os.py", line 210, in makedirs
Oct 31 05:56:35 build-node airflow: makedirs(head, mode, exist_ok)
Oct 31 05:56:35 build-node airflow: [Previous line repeated 3 more times]
Oct 31 05:56:35 build-node airflow: File "/usr/lib64/python3.6/os.py", line 220, in makedirs
Oct 31 05:56:35 build-node airflow: mkdir(name, mode)
Oct 31 05:56:35 build-node airflow: PermissionError: [Errno 13] Permission denied: '/var/log/airflow/scheduler/2019-10-31/../../../usr'
{noformat}
The user who runs "airflow scheduler" doesn't have permission to create
folders under /var/log, so it throws a PermissionError. If creation is
allowed, the scheduler generates many log files under the path:
"/var/log/airflow/scheduler/2019-10-31/../../../usr/local/lib/python3.6/site-packages/airflow/example_dags/"
{noformat}
# ls -la /var/log/airflow/scheduler/2019-10-31/../../../usr/local/lib/python3.6/site-packages/airflow/example_dags/
total 2212
drwxr-xr-x. 3 airflow airflow 4096 Oct 31 06:04 .
drwxr-xr-x. 3 airflow airflow 26 Oct 31 06:04 ..
-rw-r--r--. 1 airflow airflow 90610 Oct 31 06:18 docker_copy_data.py.log
-rw-r--r--. 1 airflow airflow 93636 Oct 31 06:18 example_bash_operator.py.log
-rw-r--r--. 1 airflow airflow 95777 Oct 31 06:18 example_branch_operator.py.log
-rw-r--r--. 1 airflow airflow 50840 Oct 31 06:18 example_branch_python_dop_operator_3.py.log
-rw-r--r--. 1 airflow airflow 93480 Oct 31 06:18 example_docker_operator.py.log
-rw-r--r--. 1 airflow airflow 94792 Oct 31 06:18 example_http_operator.py.log
-rw-r--r--. 1 airflow airflow 93152 Oct 31 06:18 example_latest_only.py.log
-rw-r--r--. 1 airflow airflow 98334 Oct 31 06:18 example_latest_only_with_trigger.py.log
-rw-r--r--. 1 airflow airflow 103648 Oct 31 06:18 example_passing_params_via_test_command.py.log
-rw-r--r--. 1 airflow airflow 93150 Oct 31 06:18 example_pig_operator.py.log
-rw-r--r--. 1 airflow airflow 67744 Oct 31 06:18 example_python_operator.py.log
-rw-r--r--. 1 airflow airflow 49610 Oct 31 06:18 example_short_circuit_operator.py.log
-rw-r--r--. 1 airflow airflow 92332 Oct 31 06:18 example_skip_dag.py.log
-rw-r--r--. 1 airflow airflow 101844 Oct 31 06:18 example_subdag_operator.py.log
-rw-r--r--. 1 airflow airflow 99220 Oct 31 06:18 example_trigger_controller_dag.py.log
-rw-r--r--. 1 airflow airflow 97252 Oct 31 06:18 example_trigger_target_dag.py.log
-rw-r--r--. 1 airflow airflow 90364 Oct 31 06:18 example_xcom.py.log
drwxr-xr-x. 2 airflow airflow 27 Oct 31 06:04 subdags
-rw-r--r--. 1 airflow airflow 55590 Oct 31 06:18 test_utils.py.log
-rw-r--r--. 1 airflow airflow 86240 Oct 31 06:18 tutorial.py.log
{noformat}
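The listing above is consistent with the log path being built from the DAG file's location *relative to* the dags folder. The sketch below is an illustrative reconstruction of that computation, not the exact Airflow internals; the dags folder path (/root/airflow/dags) and the scheduler log directory are assumptions chosen to reproduce the ".." depth seen in the error:

```python
import os

# Assumed locations (illustrative): a dags_folder three levels deep, and a
# bundled example DAG that lives outside it, inside site-packages.
dags_folder = "/root/airflow/dags"
example_dag = "/usr/local/lib/python3.6/site-packages/airflow/example_dags/example_xcom.py"

# Computing the DAG file's path relative to the dags folder walks up to "/"
# and back down, producing leading ".." segments.
rel = os.path.relpath(example_dag, start=dags_folder)
print(rel)  # ../../../usr/local/lib/.../example_dags/example_xcom.py

# Joining that under the dated scheduler log directory escapes the
# configured base_log_folder entirely.
log_dir = os.path.join("/var/log/airflow/scheduler/2019-10-31",
                       os.path.dirname(rel))
print(os.path.normpath(log_dir))
```

Any DAG file outside the dags folder (the shipped example_dags being the obvious case) triggers this escape.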
The absolute path is
"/var/log/usr/local/lib/python3.6/site-packages/airflow/example_dags/"
I don't know why the scheduler doesn't use the configuration in airflow.cfg
but instead creates another log folder from a relative path.
{noformat}
...
base_log_folder = /var/log/airflow
child_process_log_directory = /var/log/airflow/scheduler
...
{noformat}
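For reference, collapsing the ".." segments in the denied path confirms where the logs actually land, well outside base_log_folder:

```python
import os

# The directory the scheduler tried to create, as reported in the traceback.
p = "/var/log/airflow/scheduler/2019-10-31/../../../usr/local/lib/python3.6/site-packages/airflow/example_dags/"
print(os.path.normpath(p))  # /var/log/usr/local/lib/python3.6/site-packages/airflow/example_dags
```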
I think that's a {color:#DE350B}BUG{color}. I have also verified version
1.10.6rc1; same behavior.
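As a possible stopgap (an assumption on my part, not a fix for the underlying relative-path computation): if the bundled example DAGs aren't needed, disabling them in airflow.cfg should keep the scheduler from parsing files outside the dags folder at all:

```ini
[core]
# Stop loading the example DAGs shipped inside site-packages, so the
# scheduler never derives a log path from a file outside dags_folder.
load_examples = False
```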
> Scheduler: 'airflow scheduler' fails to make opt directory
> -----------------------------------------------------------
>
> Key: AIRFLOW-4719
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4719
> Project: Apache Airflow
> Issue Type: Bug
> Components: scheduler
> Affects Versions: 1.10.3
> Environment: In a fedora 29 Singularity container
> Reporter: Robert Lugg
> Priority: Major
>
> I have a strange error when running `airflow scheduler`. Python errors with:
> `Process DagFileProcessor0-Process:
> Traceback (most recent call last):
> File "/usr/lib64/python3.7/multiprocessing/process.py", line 297, in _bootstrap
> self.run()
> File "/usr/lib64/python3.7/multiprocessing/process.py", line 99, in run
> self._target(*self._args, **self._kwargs)
> File "/opt/venv/tensorflow-1.13/lib/python3.7/site-packages/airflow/jobs.py", line 381, in helper
> set_context(log, file_path)
> File "/opt/venv/tensorflow-1.13/lib/python3.7/site-packages/airflow/utils/log/logging_mixin.py", line 170, in set_context
> handler.set_context(value)
> File "/opt/venv/tensorflow-1.13/lib/python3.7/site-packages/airflow/utils/log/file_processor_handler.py", line 65, in set_context
> local_loc = self._init_file(filename)
> File "/opt/venv/tensorflow-1.13/lib/python3.7/site-packages/airflow/utils/log/file_processor_handler.py", line 141, in _init_file
> os.makedirs(directory)
> File "/opt/venv/tensorflow-1.13/lib64/python3.7/os.py", line 211, in makedirs
> makedirs(head, exist_ok=exist_ok)
> File "/opt/venv/tensorflow-1.13/lib64/python3.7/os.py", line 211, in makedirs
> makedirs(head, exist_ok=exist_ok)
> File "/opt/venv/tensorflow-1.13/lib64/python3.7/os.py", line 211, in makedirs
> makedirs(head, exist_ok=exist_ok)
> [Previous line repeated 5 more times]
> File "/opt/venv/tensorflow-1.13/lib64/python3.7/os.py", line 221, in makedirs
> mkdir(name, mode)
> OSError: [Errno 30] Read-only file system: '/remote/XXX/rlugg/machine_learning/20190530_airflow_learn2/airflow/logs/scheduler/2019-05-31/../../../../../../../opt'
> `
> That very last line shows the problem.
> Airflow is attempting to make a directory one level down from the directory I
> own. `/remote/XXX/rlugg` is my directory, but `/remote/XXX` is not.
> I use AIRFLOW_HOME to point to
> `/remote/XXX/rlugg/machine_learning/20190530_airflow_learn2/airflow`
> I've tried changing airflow.cfg as well as setting the environment variable
> `export AIRFLOW__CORE__BASE_LOG_FOLDER=/x/y/z`, yet the same error (with the
> exact same directory) is shown.
> I am running within a Singularity container if that's relevant.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)