radumas commented on issue #51512: URL: https://github.com/apache/airflow/issues/51512#issuecomment-3059000919
I think this might be related to https://github.com/apache/airflow/issues/51624, which is queued up for the 3.0.3 release. We just had a seemingly similar state-mismatch failure, and this was the output in the api_server log:

```
[2025-07-10T16:17:52.307-0400] {local_executor.py:96} ERROR - uhoh
Traceback (most recent call last):
  File "/home/airflow/airflow_venv/lib64/python3.9/site-packages/airflow/executors/local_executor.py", line 92, in _run_worker
    _execute_work(log, workload)
  File "/home/airflow/airflow_venv/lib64/python3.9/site-packages/airflow/executors/local_executor.py", line 120, in _execute_work
    supervise(
  File "/home/airflow/airflow_venv/lib64/python3.9/site-packages/airflow/sdk/execution_time/supervisor.py", line 1509, in supervise
    client = Client(base_url=server or "", limits=limits, dry_run=dry_run, token=token)
  File "/home/airflow/airflow_venv/lib64/python3.9/site-packages/airflow/sdk/api/client.py", line 679, in __init__
    super().__init__(
  File "/home/airflow/airflow_venv/lib64/python3.9/site-packages/httpx/_client.py", line 685, in __init__
    self._transport = self._init_transport(
  File "/home/airflow/airflow_venv/lib64/python3.9/site-packages/httpx/_client.py", line 733, in _init_transport
    return HTTPTransport(
  File "/home/airflow/airflow_venv/lib64/python3.9/site-packages/httpx/_transports/default.py", line 136, in __init__
  File "/home/airflow/airflow_venv/lib64/python3.9/site-packages/httpx/_config.py", line 53, in create_ssl_context
  File "/home/airflow/airflow_venv/lib64/python3.9/site-packages/httpx/_config.py", line 77, in __init__
  File "/home/airflow/airflow_venv/lib64/python3.9/site-packages/httpx/_config.py", line 89, in load_ssl_context
  File "/home/airflow/airflow_venv/lib64/python3.9/site-packages/httpx/_config.py", line 147, in load_ssl_context_verify
OSError: [Errno 24] Too many open files
[2025-07-10T16:17:52.464-0400] {scheduler_job_runner.py:810} INFO - Received executor event with state failed for task instance TaskInstanceKey(dag_id='dag_name', task_id='task', run_id='manual__2025-07-10T20:17:40.867464+00:00', try_number=1, map_index=-1)
[2025-07-10T16:17:52.468-0400] {scheduler_job_runner.py:852} INFO - TaskInstance Finished: dag_id=dag_name, task_id=task, run_id=manual__2025-07-10T20:17:40.867464+00:00, map_index=-1, run_start_date=None, run_end_date=None, run_duration=None, state=queued, executor=LocalExecutor(parallelism=64), executor_state=failed, try_number=1, max_tries=0, pool=default_pool, queue=default, priority_weight=8, operator=_PythonDecoratedOperator, queued_dttm=2025-07-10 20:17:52.079757+00:00, scheduled_dttm=2025-07-10 20:17:51.980506+00:00, queued_by_job_id=889674, pid=None
[2025-07-10T16:17:52.469-0400] {scheduler_job_runner.py:925} ERROR - DAG 'dag_name' for task instance <TaskInstance: task manual__2025-07-10T20:17:40.867464+00:00 [queued]> not found in serialized_dag table
[2025-07-10T16:17:52.469-0400] {taskinstance.py:1882} ERROR - Executor LocalExecutor(parallelism=64) reported that the task instance <TaskInstance: task manual__2025-07-10T20:17:40.867464+00:00 [queued]> finished with state failed, but the task instance's state attribute is queued. Learn more: https://airflow.apache.org/docs/apache-airflow/stable/troubleshooting.html#task-state-changed-externally
Task instance in failure state
Task instance's state was changed through the API.
```
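For anyone hitting the same `OSError: [Errno 24] Too many open files` underneath a state-mismatch report: a quick way to inspect (and, where allowed, raise) the process file-descriptor limit from the same environment the executor runs in is Python's stdlib `resource` module. This is just a diagnostic sketch, not an Airflow fix; the real remedy is likely bumping the `nofile` ulimit for the scheduler/api-server service or waiting for the linked issue's fix.

```python
import resource

# Query the current soft and hard limits on open file descriptors
# (RLIMIT_NOFILE). The OSError above fires when the soft limit is reached,
# here while httpx was opening CA certificates to build an SSL context.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"open-files soft limit: {soft}, hard limit: {hard}")

# An unprivileged process may raise its own soft limit up to the hard limit:
if soft < hard:
    resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
```

For a systemd-managed Airflow, the equivalent persistent change would be `LimitNOFILE=` in the unit file; under LocalExecutor each supervised task inherits the parent's limit, so a low default gets exhausted quickly at parallelism=64.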
