denskh opened a new issue, #29112:
URL: https://github.com/apache/airflow/issues/29112
### Official Helm Chart version
1.7.0 (latest released)
### Apache Airflow version
2.5.1
### Kubernetes Version
1.24.6
### Helm Chart configuration
```yaml
executor: "KubernetesExecutor"  # however, the same issue happens with LocalExecutor
logs:
  persistence:
    enabled: true
    size: 50Gi
    storageClassName: azurefile-csi
```
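Not part of the original report, but for illustration: Azure Files volumes are SMB-backed and typically reject POSIX `chmod`, so a commonly documented mitigation is to fix permissions at mount time via the StorageClass `mountOptions`. This is a hedged sketch; the StorageClass name is hypothetical and the uid/gid values assume the default official Airflow image user:

```yaml
# Hypothetical StorageClass sketch: on Azure Files (SMB), chmod from inside
# the pod does not take effect, so permissions are set when the share mounts.
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: azurefile-csi-airflow-logs   # hypothetical name
provisioner: file.csi.azure.com
mountOptions:
  - dir_mode=0777
  - file_mode=0777
  - uid=50000   # default user of the official Airflow image (assumption)
  - gid=0
parameters:
  skuName: Standard_LRS
```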
### Docker Image customizations
Using `airflow:2.5.1-python3.10` as the base image.
Custom shared libraries are copied into a folder under `/opt/airflow/company`.
DAGs are copied into `/opt/airflow/dags`.
### What happened
After migrating from Airflow 2.4.3 to 2.5.1, I started getting the error below. There were no other changes to the custom image. No task can run because of this error:
```
Traceback (most recent call last):
  File "/home/airflow/.local/bin/airflow", line 8, in <module>
    sys.exit(main())
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/__main__.py", line 39, in main
    args.func(args)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/cli/cli_parser.py", line 52, in command
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/utils/cli.py", line 108, in wrapper
    return f(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 384, in task_run
    ti.init_run_context(raw=args.raw)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/models/taskinstance.py", line 2414, in init_run_context
    self._set_context(self)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/utils/log/logging_mixin.py", line 77, in _set_context
    set_context(self.log, context)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/utils/log/logging_mixin.py", line 213, in set_context
    flag = cast(FileTaskHandler, handler).set_context(value)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/utils/log/file_task_handler.py", line 71, in set_context
    local_loc = self._init_file(ti)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/utils/log/file_task_handler.py", line 382, in _init_file
    self._prepare_log_folder(Path(full_path).parent)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/utils/log/file_task_handler.py", line 358, in _prepare_log_folder
    directory.chmod(mode)
  File "/usr/local/lib/python3.10/pathlib.py", line 1191, in chmod
    self._accessor.chmod(self, mode, follow_symlinks=follow_symlinks)
PermissionError: [Errno 1] Operation not permitted: '/opt/airflow/logs/dag_id=***/run_id=manual__2023-01-22T02:59:43.752407+00:00/task_id=***'
```
### What you think should happen instead
It seems Airflow attempts to change the log folder permissions but does not have permission to do so.
I get the same error when running the command manually (I confirmed the folder path exists):

```
$ chmod 511 '/opt/airflow/logs/dag_id=***/run_id=manual__2023-01-22T02:59:43.752407+00:00/task_id=***'
chmod: changing permissions of '/opt/airflow/logs/dag_id=***/run_id=scheduled__2023-01-23T15:30:00+00:00/task_id=***': Operation not permitted
```
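For illustration only (this is not Airflow's actual code): a hedged sketch of the behaviour described above. The helper name `prepare_log_folder` is hypothetical; it creates the log directory but treats a `PermissionError` from `chmod` as non-fatal, which is what the report argues should happen on filesystems such as Azure Files that reject permission changes:

```python
from pathlib import Path


def prepare_log_folder(path: Path, mode: int = 0o777) -> None:
    """Create a log folder, tolerating filesystems that reject chmod.

    Hypothetical sketch: on mounts such as azurefile-csi, chmod can raise
    PermissionError even though the directory is perfectly writable, so
    the failure is swallowed instead of aborting the task.
    """
    path.mkdir(parents=True, exist_ok=True)
    try:
        path.chmod(mode)
    except PermissionError:
        # e.g. SMB-backed volumes ignore POSIX permission changes
        pass
```

The key design point is that the `chmod` is best-effort: the directory is still created and usable even when the permission change is refused.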
### How to reproduce
My understanding is that this error happens before any custom code is
executed.
### Anything else
The error happens every time; no DAG can start while using Airflow 2.5.1 or 2.5.0. Exactly the same configuration works with 2.4.3.
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)