[ 
https://issues.apache.org/jira/browse/AIRFLOW-4719?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Lugg updated AIRFLOW-4719:
---------------------------------
    Description: 
I have a strange error when running `airflow scheduler`. Python errors with:

    `Process DagFileProcessor0-Process:
    Traceback (most recent call last):
      File "/usr/lib64/python3.7/multiprocessing/process.py", line 297, in _bootstrap
        self.run()
      File "/usr/lib64/python3.7/multiprocessing/process.py", line 99, in run
        self._target(*self._args, **self._kwargs)
      File "/opt/venv/tensorflow-1.13/lib/python3.7/site-packages/airflow/jobs.py", line 381, in helper
        set_context(log, file_path)
      File "/opt/venv/tensorflow-1.13/lib/python3.7/site-packages/airflow/utils/log/logging_mixin.py", line 170, in set_context
        handler.set_context(value)
      File "/opt/venv/tensorflow-1.13/lib/python3.7/site-packages/airflow/utils/log/file_processor_handler.py", line 65, in set_context
        local_loc = self._init_file(filename)
      File "/opt/venv/tensorflow-1.13/lib/python3.7/site-packages/airflow/utils/log/file_processor_handler.py", line 141, in _init_file
        os.makedirs(directory)
      File "/opt/venv/tensorflow-1.13/lib64/python3.7/os.py", line 211, in makedirs
        makedirs(head, exist_ok=exist_ok)
      File "/opt/venv/tensorflow-1.13/lib64/python3.7/os.py", line 211, in makedirs
        makedirs(head, exist_ok=exist_ok)
      File "/opt/venv/tensorflow-1.13/lib64/python3.7/os.py", line 211, in makedirs
        makedirs(head, exist_ok=exist_ok)
      [Previous line repeated 5 more times]
      File "/opt/venv/tensorflow-1.13/lib64/python3.7/os.py", line 221, in makedirs
        mkdir(name, mode)
    OSError: [Errno 30] Read-only file system: '/remote/XXX/rlugg/machine_learning/20190530_airflow_learn2/airflow/logs/scheduler/2019-05-31/../../../../../../../opt'`

That very last line shows the problem.

Airflow is attempting to make a directory one level down from the directory I 
own. `/remote/XXX/rlugg` is my directory, but `/remote/XXX` is not.
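A plausible mechanism (my guess from the traceback, not verified against the Airflow source): `file_processor_handler.py` seems to derive each per-DAG-file log path from the DAG file's location relative to the dags folder, so a DAG file living outside that folder (e.g. an example DAG under `/opt/...`) yields a relative path made of `..` components that climbs out of the tree I own. A minimal sketch with hypothetical paths (not my real config):

```python
import os

# Hypothetical paths illustrating what the traceback suggests: the scheduler
# builds a per-DAG-file log path from the DAG file's location relative to the
# dags folder.
log_base = "/home/user/airflow/logs/scheduler/2019-05-31"
dags_folder = "/home/user/airflow/dags"
dag_file = "/opt/venv/site-packages/airflow/example_dags/tutorial.py"

# A DAG file outside the dags folder produces a relative path that is all
# ".." components followed by the file's absolute-path components:
rel = os.path.relpath(dag_file, start=dags_folder)
print(rel)
# ../../../../opt/venv/site-packages/airflow/example_dags/tutorial.py

# Joining that onto the log directory walks up and out of the tree the user
# owns before descending into an "opt" directory there:
target = os.path.dirname(os.path.join(log_base, rel))
print(os.path.normpath(target))
# /home/user/opt/venv/site-packages/airflow/example_dags
# os.makedirs(target) would then try to mkdir under /home/user -- or, with my
# real paths, under the read-only /remote/XXX -- raising OSError errno 30.
```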

I use AIRFLOW_HOME to point to 
`/remote/XXX/rlugg/machine_learning/20190530_airflow_learn2/airflow`.
I've tried changing airflow.cfg as well as setting the environment variable 
`export AIRFLOW__CORE__BASE_LOG_FOLDER=/x/y/z`, yet the same error (with the 
exact same directory) is shown.
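As a possible workaround (untested, and assuming the 1.10.x config layout): the scheduler's per-DAG-file processor logs are placed under `[scheduler] child_process_log_directory`, which defaults to `{base_log_folder}/scheduler`, so overriding the base log folder alone may not move them; overriding that setting directly might behave differently:

```shell
# Untested workaround sketch: point the scheduler's per-file processor logs
# at a directory that is definitely writable inside the container.
export AIRFLOW__SCHEDULER__CHILD_PROCESS_LOG_DIRECTORY=/tmp/airflow/logs/scheduler
mkdir -p "$AIRFLOW__SCHEDULER__CHILD_PROCESS_LOG_DIRECTORY"
airflow scheduler
```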

I am running within a Singularity container if that's relevant.



> Scheduler:  'airflow scheduler' fails to make opt directory
> -----------------------------------------------------------
>
>                 Key: AIRFLOW-4719
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-4719
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: scheduler
>    Affects Versions: 1.10.3
>         Environment: In a fedora 29 Singularity container
>            Reporter: Robert Lugg
>            Priority: Major
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
