abhijit-sarkar-infocepts opened a new issue, #45405:
URL: https://github.com/apache/airflow/issues/45405

   ### Apache Airflow version
   
   Other Airflow 2 version (please specify below)
   
   ### If "Other Airflow 2 version" selected, which one?
   
   2.9.3
   
   ### What happened?
   
   I am deploying Airflow 2.9.3 on AKS with Helm chart 1.15.0, using the 
CeleryExecutor. For logs I created a PV and PVC with the storage class 
**azureblob-fuse-premium**. But after deployment all pods are failing while 
trying to create files on the PV.
   
   Note: From inside the pods I tried creating a file (`touch test.txt`) under 
**/opt/airflow/logs**, and it was created in the PV (Azure storage account).
   
   ### What you think should happen instead?
   
   I first tried a PVC with the Azure **file-share** driver: the logs were 
written, but the Airflow pods were unable to fetch them from the PV because 
Azure file-share does not allow the os.chmod operation. I then switched to an 
azureblob-fuse-premium PVC, following this document: [Azure Airflow 
Document](https://learn.microsoft.com/en-us/azure/aks/airflow-deploy).
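   The tracebacks below all bottom out in the same call: Airflow's 
`file_processor_handler` creates the nested scheduler log directory with 
`Path(...).mkdir(parents=True, exist_ok=True)` at startup. A minimal sketch of 
that call (the helper name `check_log_dir` is mine, not Airflow's) can be 
exec'd inside a pod against the mounted volume to check whether the storage 
class supports nested directory creation at all:

   ```python
   from pathlib import Path

   def check_log_dir(base: str, subpath: str = "scheduler/2025-01-05") -> bool:
       """Mimic what airflow/utils/log/file_processor_handler.py does at
       startup: create the nested scheduler log directory, parents included."""
       target = Path(base) / subpath
       try:
           # Same call as in the traceback; should never raise with parents=True
           # and exist_ok=True on a POSIX-compliant filesystem.
           target.mkdir(parents=True, exist_ok=True)
           return target.is_dir()
       except OSError as exc:
           print(f"mkdir failed: {exc}")
           return False

   if __name__ == "__main__":
       # Inside the pod, point this at the mount, e.g. check_log_dir("/opt/airflow/logs")
       import tempfile
       with tempfile.TemporaryDirectory() as tmp:
           print(check_log_dir(tmp))
   ```

   If this returns False on the blobfuse mount while succeeding on a local 
path, the problem is the mount semantics rather than Airflow's logging config.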
   
   **Logs**
   
   Usage: python -m celery [OPTIONS] COMMAND [ARGS]...
   Try 'python -m celery --help' for help.
   
   Error: Invalid value for '-A' / '--app':
   Unable to load celery application.
   While trying to load the module 
airflow.providers.celery.executors.celery_executor.app the following error 
occurred:
   Traceback (most recent call last):
     File "/usr/local/lib/python3.8/pathlib.py", line 1288, in mkdir
       self._accessor.mkdir(self, mode)
   FileNotFoundError: [Errno 2] No such file or directory: 
'/opt/airflow/logs/scheduler/2025-01-05'
   
   During handling of the above exception, another exception occurred:
   
   Traceback (most recent call last):
     File "/usr/local/lib/python3.8/pathlib.py", line 1288, in mkdir
       self._accessor.mkdir(self, mode)
   FileNotFoundError: [Errno 2] No such file or directory: 
'/opt/airflow/logs/scheduler'
   
   During handling of the above exception, another exception occurred:
   
   Traceback (most recent call last):
     File "/usr/local/lib/python3.8/logging/config.py", line 563, in configure
       handler = self.configure_handler(handlers[name])
     File "/usr/local/lib/python3.8/logging/config.py", line 744, in 
configure_handler
       result = factory(**kwargs)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/log/file_processor_handler.py",
 line 53, in __init__
       Path(self._get_log_directory()).mkdir(parents=True, exist_ok=True)
     File "/usr/local/lib/python3.8/pathlib.py", line 1292, in mkdir
       self.parent.mkdir(parents=True, exist_ok=True)
     File "/usr/local/lib/python3.8/pathlib.py", line 1293, in mkdir
       self.mkdir(mode, parents=False, exist_ok=exist_ok)
     File "/usr/local/lib/python3.8/pathlib.py", line 1288, in mkdir
       self._accessor.mkdir(self, mode)
   FileNotFoundError: [Errno 2] No such file or directory: 
'/opt/airflow/logs/scheduler'
   
   The above exception was the direct cause of the following exception:
   
   Traceback (most recent call last):
     File 
"/home/airflow/.local/lib/python3.8/site-packages/kombu/utils/imports.py", line 
59, in symbol_by_name
       module = imp(module_name, package=package, **kwargs)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/celery/utils/imports.py", 
line 109, in import_from_cwd
       return imp(module, package=package)
     File "/usr/local/lib/python3.8/importlib/__init__.py", line 127, in 
import_module
       return _bootstrap._gcd_import(name[level:], package, level)
     File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
     File "<frozen importlib._bootstrap>", line 991, in _find_and_load
     File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked
     File "<frozen importlib._bootstrap>", line 219, in 
_call_with_frames_removed
     File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
     File "<frozen importlib._bootstrap>", line 991, in _find_and_load
     File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked
     File "<frozen importlib._bootstrap>", line 219, in 
_call_with_frames_removed
     File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
     File "<frozen importlib._bootstrap>", line 991, in _find_and_load
     File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked
     File "<frozen importlib._bootstrap>", line 219, in 
_call_with_frames_removed
     File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
     File "<frozen importlib._bootstrap>", line 991, in _find_and_load
     File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked
     File "<frozen importlib._bootstrap>", line 219, in 
_call_with_frames_removed
     File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
     File "<frozen importlib._bootstrap>", line 991, in _find_and_load
     File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
     File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
     File "<frozen importlib._bootstrap_external>", line 843, in exec_module
     File "<frozen importlib._bootstrap>", line 219, in 
_call_with_frames_removed
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/__init__.py", line 
74, in <module>
       settings.initialize()
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/settings.py", line 
531, in initialize
       LOGGING_CLASS_PATH = configure_logging()
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/logging_config.py", 
line 74, in configure_logging
       raise e
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/logging_config.py", 
line 69, in configure_logging
       dictConfig(logging_config)
     File "/usr/local/lib/python3.8/logging/config.py", line 808, in dictConfig
       dictConfigClass(config).configure()
     File "/usr/local/lib/python3.8/logging/config.py", line 570, in configure
       raise ValueError('Unable to configure handler '
   ValueError: Unable to configure handler 'processor'
   
   During handling of the above exception, another exception occurred:
   
   Traceback (most recent call last):
     File 
"/home/airflow/.local/lib/python3.8/site-packages/celery/bin/celery.py", line 
58, in convert
       return find_app(value)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/celery/app/utils.py", line 
383, in find_app
       sym = symbol_by_name(app, imp=imp)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/kombu/utils/imports.py", line 
61, in symbol_by_name
       reraise(ValueError,
     File 
"/home/airflow/.local/lib/python3.8/site-packages/kombu/exceptions.py", line 
34, in reraise
       raise value.with_traceback(tb)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/kombu/utils/imports.py", line 
59, in symbol_by_name
       module = imp(module_name, package=package, **kwargs)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/celery/utils/imports.py", 
line 109, in import_from_cwd
       return imp(module, package=package)
     File "/usr/local/lib/python3.8/importlib/__init__.py", line 127, in 
import_module
       return _bootstrap._gcd_import(name[level:], package, level)
     File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
     File "<frozen importlib._bootstrap>", line 991, in _find_and_load
     File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked
     File "<frozen importlib._bootstrap>", line 219, in 
_call_with_frames_removed
     File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
     File "<frozen importlib._bootstrap>", line 991, in _find_and_load
     File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked
     File "<frozen importlib._bootstrap>", line 219, in 
_call_with_frames_removed
     File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
     File "<frozen importlib._bootstrap>", line 991, in _find_and_load
     File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked
     File "<frozen importlib._bootstrap>", line 219, in 
_call_with_frames_removed
     File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
     File "<frozen importlib._bootstrap>", line 991, in _find_and_load
     File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked
     File "<frozen importlib._bootstrap>", line 219, in 
_call_with_frames_removed
     File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
     File "<frozen importlib._bootstrap>", line 991, in _find_and_load
     File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
     File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
     File "<frozen importlib._bootstrap_external>", line 843, in exec_module
     File "<frozen importlib._bootstrap>", line 219, in 
_call_with_frames_removed
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/__init__.py", line 
74, in <module>
       settings.initialize()
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/settings.py", line 
531, in initialize
       LOGGING_CLASS_PATH = configure_logging()
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/logging_config.py", 
line 74, in configure_logging
       raise e
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/logging_config.py", 
line 69, in configure_logging
       dictConfig(logging_config)
     File "/usr/local/lib/python3.8/logging/config.py", line 808, in dictConfig
       dictConfigClass(config).configure()
     File "/usr/local/lib/python3.8/logging/config.py", line 570, in configure
       raise ValueError('Unable to configure handler '
   ValueError: Couldn't import 
'airflow.providers.celery.executors.celery_executor.app': Unable to configure 
handler 'processor'
   PS C:\AirflowSetup\Prod> kubectl logs airflow-worker-6f9f456bbd-bdfnk -n 
airflow293
   Defaulted container "worker" out of: worker, wait-for-airflow-migrations 
(init)
   ....................
   ERROR! Maximum number of retries (20) reached.
   
   Last check result:
   $ airflow db check
   Unable to load the config, contains a configuration error.
   Traceback (most recent call last):
     File "/usr/local/lib/python3.8/pathlib.py", line 1288, in mkdir
       self._accessor.mkdir(self, mode)
   FileNotFoundError: [Errno 2] No such file or directory: 
'/opt/airflow/logs/scheduler/2025-01-05'
   
   During handling of the above exception, another exception occurred:
   
   Traceback (most recent call last):
     File "/usr/local/lib/python3.8/pathlib.py", line 1288, in mkdir
       self._accessor.mkdir(self, mode)
   FileNotFoundError: [Errno 2] No such file or directory: 
'/opt/airflow/logs/scheduler'
   
   During handling of the above exception, another exception occurred:
   
   Traceback (most recent call last):
     File "/usr/local/lib/python3.8/logging/config.py", line 563, in configure
       handler = self.configure_handler(handlers[name])
     File "/usr/local/lib/python3.8/logging/config.py", line 744, in 
configure_handler
       result = factory(**kwargs)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/log/file_processor_handler.py",
 line 53, in __init__
       Path(self._get_log_directory()).mkdir(parents=True, exist_ok=True)
     File "/usr/local/lib/python3.8/pathlib.py", line 1292, in mkdir
       self.parent.mkdir(parents=True, exist_ok=True)
     File "/usr/local/lib/python3.8/pathlib.py", line 1293, in mkdir
       self.mkdir(mode, parents=False, exist_ok=exist_ok)
     File "/usr/local/lib/python3.8/pathlib.py", line 1288, in mkdir
       self._accessor.mkdir(self, mode)
   FileNotFoundError: [Errno 2] No such file or directory: 
'/opt/airflow/logs/scheduler'
   
   The above exception was the direct cause of the following exception:
   
   Traceback (most recent call last):
     File "/home/airflow/.local/bin/airflow", line 5, in <module>
       from airflow.__main__ import main
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/__init__.py", line 
74, in <module>
       settings.initialize()
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/settings.py", line 
531, in initialize
       LOGGING_CLASS_PATH = configure_logging()
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/logging_config.py", 
line 74, in configure_logging
       raise e
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/logging_config.py", 
line 69, in configure_logging
       dictConfig(logging_config)
     File "/usr/local/lib/python3.8/logging/config.py", line 808, in dictConfig
       dictConfigClass(config).configure()
     File "/usr/local/lib/python3.8/logging/config.py", line 570, in configure
       raise ValueError('Unable to configure handler '
   ValueError: Unable to configure handler 'processor'
   
   ### How to reproduce
   
   I am setting up an air-gapped Airflow deployment:
   1. Built a Docker image and pushed it to ACR.
   2. Created the PV and PVC.
   3. Deployed with the Helm chart.
   
   ```
   logs:
     # Configuration for empty dir volume (if logs.persistence.enabled == false)
     # emptyDirConfig:
     #   sizeLimit: 1Gi
     #   medium: Memory
     persistence:
       # Enable persistent volume for storing logs
       enabled: true
       # Volume size for logs
       size: 50Gi
       # Annotations for the logs PVC
       annotations: {}
       # If using a custom storageClass, pass name here
       storageClassName: azureblob-fuse-premium
       ## the name of an existing PVC to use
       existingClaim: pvc-airflow-logs-blobfuse
   ```
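   Since the file-share driver failed on os.chmod and blobfuse fails on mkdir, 
a quick probe of which filesystem operations the mounted volume actually 
supports can narrow this down. The helper below is hypothetical (not part of 
Airflow); run it inside any pod against `/opt/airflow/logs`:

   ```python
   import os
   import stat

   def probe_mount(path: str) -> dict:
       """Check whether a mounted log volume supports the operations Airflow
       needs: plain file creation, chmod, and nested directory creation."""
       results = {}
       test_file = os.path.join(path, "_airflow_probe.txt")
       try:
           with open(test_file, "w") as f:
               f.write("probe")
           results["create_file"] = True
           try:
               # Azure file-share reportedly rejects this call.
               os.chmod(test_file, stat.S_IRUSR | stat.S_IWUSR)
               results["chmod"] = True
           except OSError:
               results["chmod"] = False
           os.remove(test_file)
       except OSError:
           results["create_file"] = False
       nested = os.path.join(path, "_probe_dir", "nested")
       try:
           # Same pattern as the failing scheduler log directory creation.
           os.makedirs(nested, exist_ok=True)
           results["nested_mkdir"] = True
           os.rmdir(nested)
           os.rmdir(os.path.dirname(nested))
       except OSError:
           results["nested_mkdir"] = False
       return results
   ```

   On a healthy POSIX mount all three probes come back True; whichever one is 
False points at the capability the storage class is missing.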
   
   ### Operating System
   
   Linux (Ubuntu)
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon==8.25.0
   apache-airflow-providers-celery==3.7.2
   apache-airflow-providers-cncf-kubernetes==8.4.2
   apache-airflow-providers-common-io==1.3.2
   apache-airflow-providers-common-sql==1.14.2
   apache-airflow-providers-docker==3.12.2
   apache-airflow-providers-elasticsearch==5.4.1
   apache-airflow-providers-fab==1.2.2
   apache-airflow-providers-ftp==3.10.0
   apache-airflow-providers-google==10.21.0
   apache-airflow-providers-grpc==3.5.2
   apache-airflow-providers-hashicorp==3.7.1
   apache-airflow-providers-http==4.12.0
   apache-airflow-providers-imap==3.6.1
   apache-airflow-providers-microsoft-azure==10.0.0
   apache-airflow-providers-microsoft-winrm==3.4.0
   apache-airflow-providers-mysql==5.6.2
   apache-airflow-providers-odbc==4.6.2
   apache-airflow-providers-openlineage==1.9.1
   apache-airflow-providers-postgres==5.11.2
   apache-airflow-providers-redis==3.7.1
   apache-airflow-providers-sendgrid==3.5.1
   apache-airflow-providers-sftp==4.10.2
   apache-airflow-providers-slack==8.7.1
   apache-airflow-providers-smtp==1.7.1
   apache-airflow-providers-snowflake==4.1.0
   apache-airflow-providers-sqlite==3.8.1
   apache-airflow-providers-ssh==3.11.2
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   Airflow - 2.9.3
   Python - 3.8
   Helm chart - 1.15.0
   Kubernetes - 1.29.9
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   