[
https://issues.apache.org/jira/browse/AIRFLOW-4719?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16855135#comment-16855135
]
Robert Lugg commented on AIRFLOW-4719:
--------------------------------------
I wiped my airflow directory a couple of times, and the same problem came up
each time. `grep` didn't find any "opt" in airflow.cfg.
I installed Python in an /opt subdirectory (/opt/ABC/anaconda/anaconda_0.0), so
Airflow may be picking it up from there (somehow?).
I have since installed anaconda (and airflow) directly on my target system, and
it runs without error. The Singularity image that was causing the problem still
fails on my target system but works when run on the system I built it on.
I guess it's hard to act on this bug without me providing more information, so
maybe it should be closed until I can narrow it down further?
> Scheduler: 'airflow scheduler' fails to make opt directory
> -----------------------------------------------------------
>
> Key: AIRFLOW-4719
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4719
> Project: Apache Airflow
> Issue Type: Bug
> Components: scheduler
> Affects Versions: 1.10.3
> Environment: In a fedora 29 Singularity container
> Reporter: Robert Lugg
> Priority: Major
>
> I have a strange error when running `airflow scheduler`. Python errors with:
> `Process DagFileProcessor0-Process:
> Traceback (most recent call last):
>   File "/usr/lib64/python3.7/multiprocessing/process.py", line 297, in _bootstrap
>     self.run()
>   File "/usr/lib64/python3.7/multiprocessing/process.py", line 99, in run
>     self._target(*self._args, **self._kwargs)
>   File "/opt/venv/tensorflow-1.13/lib/python3.7/site-packages/airflow/jobs.py", line 381, in helper
>     set_context(log, file_path)
>   File "/opt/venv/tensorflow-1.13/lib/python3.7/site-packages/airflow/utils/log/logging_mixin.py", line 170, in set_context
>     handler.set_context(value)
>   File "/opt/venv/tensorflow-1.13/lib/python3.7/site-packages/airflow/utils/log/file_processor_handler.py", line 65, in set_context
>     local_loc = self._init_file(filename)
>   File "/opt/venv/tensorflow-1.13/lib/python3.7/site-packages/airflow/utils/log/file_processor_handler.py", line 141, in _init_file
>     os.makedirs(directory)
>   File "/opt/venv/tensorflow-1.13/lib64/python3.7/os.py", line 211, in makedirs
>     makedirs(head, exist_ok=exist_ok)
>   File "/opt/venv/tensorflow-1.13/lib64/python3.7/os.py", line 211, in makedirs
>     makedirs(head, exist_ok=exist_ok)
>   File "/opt/venv/tensorflow-1.13/lib64/python3.7/os.py", line 211, in makedirs
>     makedirs(head, exist_ok=exist_ok)
>   [Previous line repeated 5 more times]
>   File "/opt/venv/tensorflow-1.13/lib64/python3.7/os.py", line 221, in makedirs
>     mkdir(name, mode)
> OSError: [Errno 30] Read-only file system:
> '/remote/XXX/rlugg/machine_learning/20190530_airflow_learn2/airflow/logs/scheduler/2019-05-31/../../../../../../../opt'
> `
> That very last line shows the problem.
> Airflow is attempting to create a directory inside `/remote/XXX`, one level
> above the directory I own: `/remote/XXX/rlugg` is mine, but `/remote/XXX` is not.
> I use AIRFLOW_HOME to point to
> `/remote/XXX/rlugg/machine_learning/20190530_airflow_learn2/airflow`
> I've tried changing airflow.cfg and also setting the environment variable
> `export AIRFLOW___CORE___BASE_LOG_FOLDER=/x/y/z`, yet the same error (with the
> exact same directory) appears.
> I am running within a Singularity container if that's relevant.
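One thing worth double-checking against the description above: Airflow reads configuration overrides from environment variables named `AIRFLOW__{SECTION}__{KEY}`, with *two* underscores on each side of the section name; a variable exported with three underscores would simply never be consulted, which would explain why the override appeared to have no effect. A minimal sketch of that lookup (the `env_override` helper is hypothetical, not Airflow's actual code):

```python
import os

# Airflow's documented override convention: AIRFLOW__{SECTION}__{KEY},
# two underscores around the section name.
os.environ["AIRFLOW__CORE__BASE_LOG_FOLDER"] = "/x/y/z"

def env_override(section, key):
    # Hypothetical helper mimicking the lookup; returns None when the
    # variable is absent (e.g. when it was exported with three underscores).
    return os.environ.get("AIRFLOW__{}__{}".format(section.upper(), key.upper()))

print(env_override("core", "base_log_folder"))  # /x/y/z
```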
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)