asasisekar opened a new issue, #49887:
URL: https://github.com/apache/airflow/issues/49887

   ### Apache Airflow version
   
   3.0.0
   
   ### If "Other Airflow 2 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   Upgraded Airflow from 2.10 to 3.0.0, and the DagProcessorJob fails after a few hours with the exception below:
   **OSError: [Errno 24] Too many open files**
   ```
   [2025-04-25T21:30:41.765+0100] {dag_processor_job_runner.py:63} ERROR - Exception when executing DagProcessorJob
   Traceback (most recent call last):
     File "/var/opt/rdos-airflow/venv/lib/python3.10/site-packages/airflow/jobs/dag_processor_job_runner.py", line 61, in _execute
       self.processor.run()
     File "/var/opt/rdos-airflow/venv/lib/python3.10/site-packages/airflow/dag_processing/manager.py", line 262, in run
       return self._run_parsing_loop()
     File "/var/opt/rdos-airflow/venv/lib/python3.10/site-packages/airflow/dag_processing/manager.py", line 347, in _run_parsing_loop
       self._start_new_processes()
     File "/var/opt/rdos-airflow/venv/lib/python3.10/site-packages/airflow/dag_processing/manager.py", line 894, in _start_new_processes
       processor = self._create_process(file)
     File "/var/opt/rdos-airflow/venv/lib/python3.10/site-packages/airflow/dag_processing/manager.py", line 876, in _create_process
       return DagFileProcessorProcess.start(
     File "/var/opt/rdos-airflow/venv/lib/python3.10/site-packages/airflow/dag_processing/processor.py", line 245, in start
       proc: Self = super().start(target=target, **kwargs)
     File "/var/opt/rdos-airflow/venv/lib/python3.10/site-packages/airflow/sdk/execution_time/supervisor.py", line 446, in start
       child_comms, read_msgs = mkpipe()
     File "/var/opt/rdos-airflow/venv/lib/python3.10/site-packages/airflow/sdk/execution_time/supervisor.py", line 177, in mkpipe
       rsock, wsock = socketpair()
     File "/var/opt/icetools/python/ICEpythonvenv310/python-3.10.1/lib/python3.10/socket.py", line 607, in socketpair
       a, b = _socket.socketpair(family, type, proto)
   OSError: [Errno 24] Too many open files
   ```
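
   As a rough illustration (plain Python, not Airflow code): per the traceback, each parse subprocess is started through `mkpipe()`, which calls `socketpair()` and so consumes two file descriptors. If descriptors are held and never released across parsing cycles, the process eventually hits its soft `RLIMIT_NOFILE` and `socketpair()` raises `OSError: [Errno 24]`:

   ```python
   import resource
   import socket

   # The per-process open-file limit that Errno 24 is hitting.
   soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
   print(f"RLIMIT_NOFILE: soft={soft}, hard={hard}")

   # Simulate the pattern: each socketpair() holds two descriptors.
   # Accumulating pairs without closing them walks toward the soft limit.
   pairs = [socket.socketpair() for _ in range(5)]
   print(f"descriptors held by simulated pairs: {len(pairs) * 2}")

   # Closing both ends returns the descriptors to the process.
   for a, b in pairs:
       a.close()
       b.close()
   ```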
   
   
   
   ### What you think should happen instead?
   
   _No response_
   
   ### How to reproduce
   
   `nohup airflow dag-processor`
   
   DAG Processor Config
   ```
   export AIRFLOW__DAG_PROCESSOR__BUNDLE_REFRESH_CHECK_INTERVAL=10
   export AIRFLOW__DAG_PROCESSOR__MIN_FILE_PROCESS_INTERVAL=120
   export AIRFLOW__DAG_PROCESSOR__PARSING_PROCESSES=1
   export AIRFLOW__DAG_PROCESSOR__REFRESH_INTERVAL=900
   ```
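
   A quick way to confirm a descriptor leak while reproducing (Linux-only sketch; run against the dag-processor PID rather than `$$` to watch the actual process) is to compare the soft limit with the live descriptor count and see whether the count grows over time:

   ```shell
   # Soft limit on open files for the current shell (the ceiling Errno 24 hits)
   ulimit -n

   # Descriptors currently open in this process; substitute the
   # dag-processor PID for $$ and re-run periodically to watch for growth
   ls /proc/$$/fd | wc -l
   ```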
   
   ### Operating System
   
   RHEL 8.8
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Virtualenv installation
   
   ### Deployment details
   
   _No response_
   
   ### Anything else?
   
   The problem recurs reliably, every few hours.
   There are fewer than 10 DAGs.
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

