geoffreylarnold opened a new issue, #56477:
URL: https://github.com/apache/airflow/issues/56477

   ### Apache Airflow version
   
   3.1.0
   
   ### If "Other Airflow 2 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   The scheduler process does not seem to find my system's local address when starting the log server during `airflow scheduler` startup.
   I have tested the two recommended settings for `hostname_callable` in the config file (`airflow.utils.net.get_host_ip_address` and `socket.getfqdn`), and both result in the same behavior. I confirmed the setting change with `airflow config get-value core hostname_callable`.
   
   Because of this, no tasks can run in the Airflow instance.
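
   To see what each candidate callable resolves to on this host, I used a quick stdlib check (the `gethostbyname` line is my approximation of what `get_host_ip_address` returns, not Airflow's actual implementation):

   ```python
   import socket

   # What the two hostname_callable candidates roughly resolve to on this host.
   # NOTE: the second line approximates airflow.utils.net.get_host_ip_address;
   # it is not Airflow's exact source.
   fqdn = socket.getfqdn()          # what hostname_callable = socket.getfqdn returns
   ip = socket.gethostbyname(fqdn)  # approximately what get_host_ip_address returns

   print(fqdn, ip)
   ```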
   
   Error:
   ```
   [2025-10-08T13:50:35.242333Z] {core.py:50} INFO - Starting log server on http://[::]:8793
   [2025-10-08T13:50:35.248641Z] {scheduler_job_runner.py:1018} INFO - Starting the scheduler
   [2025-10-08T13:50:35.251945Z] {executor_loader.py:281} INFO - Loaded executor: :LocalExecutor:
   [2025-10-08T13:50:35.256073Z] {scheduler_job_runner.py:2240} INFO - Adopting or resetting orphaned tasks for active dag runs
   WARNING:  ASGI app factory detected. Using it, but please consider setting the --factory flag explicitly.
   INFO:     Started server process [674660]
   INFO:     Waiting for application startup.
   INFO:     Application startup complete.
   INFO:     Uvicorn running on http://:8793 (Press CTRL+C to quit)
   [2025-10-08T13:51:24.493282Z] {scheduler_job_runner.py:1720} INFO - DAG Waze_Data_Feed is at (or above) max_active_runs (1 of 1), not creating any more runs
   [2025-10-08T13:51:24.572908Z] {scheduler_job_runner.py:419} INFO - 1 tasks up for execution:
           <TaskInstance: Waze_Data_Feed.data_pull scheduled__2025-10-08T13:50:00+00:00 [scheduled]>
   [2025-10-08T13:51:24.573145Z] {scheduler_job_runner.py:491} INFO - DAG Waze_Data_Feed has 0/16 running and queued tasks
   [2025-10-08T13:51:24.573670Z] {scheduler_job_runner.py:630} INFO - Setting the following tasks to queued state:
           <TaskInstance: Waze_Data_Feed.data_pull scheduled__2025-10-08T13:50:00+00:00 [scheduled]>
   [2025-10-08T13:51:24.576911Z] {scheduler_job_runner.py:715} INFO - Trying to enqueue tasks: [<TaskInstance: Waze_Data_Feed.data_pull scheduled__2025-10-08T13:50:00+00:00 [scheduled]>] for executor: LocalExecutor(parallelism=32)
   [2025-10-08T13:51:24.601266Z] {local_executor.py:65} INFO - Worker starting up pid=675283
   [2025-10-08T13:51:24.612964Z] {local_executor.py:65} INFO - Worker starting up pid=675284
   [2025-10-08T13:51:24.679010Z] {supervisor.py:1870} INFO - Secrets backends loaded for worker count=1 backend_classes=['EnvironmentVariablesBackend']
   /home/mgradmin/airflow/airflow_env/lib/python3.12/site-packages/airflow/sdk/execution_time/supervisor.py:476 DeprecationWarning: This process (pid=675283) is multi-threaded, use of fork() may lead to deadlocks in the child.
   [2025-10-08T13:51:24.706402Z] {before.py:42} WARNING - Starting call to 'airflow.sdk.api.client.Client.request', this is the 1st time calling it.
   [2025-10-08T13:51:25.708321Z] {before.py:42} WARNING - Starting call to 'airflow.sdk.api.client.Client.request', this is the 2nd time calling it.
   [2025-10-08T13:51:27.631341Z] {before.py:42} WARNING - Starting call to 'airflow.sdk.api.client.Client.request', this is the 3rd time calling it.
   [2025-10-08T13:51:30.964791Z] {before.py:42} WARNING - Starting call to 'airflow.sdk.api.client.Client.request', this is the 4th time calling it.
   [2025-10-08T13:51:38.037899Z] {supervisor.py:709} INFO - Process exited pid=675285 exit_code=<Negsignal.SIGKILL: -9> signal_sent=SIGKILL
   [2025-10-08T13:51:38.038389Z] {local_executor.py:100} ERROR - uhoh
   Traceback (most recent call last):
     File "/home/mgradmin/airflow/airflow_env/lib/python3.12/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
       yield
     File "/home/mgradmin/airflow/airflow_env/lib/python3.12/site-packages/httpx/_transports/default.py", line 250, in handle_request
       resp = self._pool.handle_request(req)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
     File "/home/mgradmin/airflow/airflow_env/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 256, in handle_request
       raise exc from None
     File "/home/mgradmin/airflow/airflow_env/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 236, in handle_request
       response = connection.handle_request(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^
     File "/home/mgradmin/airflow/airflow_env/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 101, in handle_request
       raise exc
     File "/home/mgradmin/airflow/airflow_env/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 78, in handle_request
       stream = self._connect(request)
                ^^^^^^^^^^^^^^^^^^^^^^
     File "/home/mgradmin/airflow/airflow_env/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 124, in _connect
       stream = self._network_backend.connect_tcp(**kwargs)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
     File "/home/mgradmin/airflow/airflow_env/lib/python3.12/site-packages/httpcore/_backends/sync.py", line 207, in connect_tcp
       with map_exceptions(exc_map):
     File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
       self.gen.throw(value)
     File "/home/mgradmin/airflow/airflow_env/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
       raise to_exc(exc) from exc
   httpcore.ConnectError: [Errno 111] Connection refused
   ```
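
   For context on the final exception: `httpcore.ConnectError: [Errno 111] Connection refused` is a wrapped `ECONNREFUSED`, i.e. the client dialed an address and port where nothing was listening. It reproduces with a bare socket (the loopback address and free-port trick below are illustrative assumptions, not the exact address the worker dials):

   ```python
   import socket

   # Reproduce Errno 111: connect to a port where nothing is listening.
   # 127.0.0.1 and the probe port are assumptions for illustration only.
   probe = socket.socket()
   probe.bind(("127.0.0.1", 0))   # grab a free port...
   port = probe.getsockname()[1]
   probe.close()                  # ...and release it so nothing listens there

   try:
       socket.create_connection(("127.0.0.1", port), timeout=2)
   except ConnectionRefusedError as exc:
       print(exc)                 # [Errno 111] Connection refused (on Linux)
   ```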
   
   ### What you think should happen instead?
   
   The log server should display the proper hostname or IP address instead of `[::]` (its current output), and tasks should be able to write to their logs.
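
   For what it's worth, a server bound to the IPv6 wildcard reports `::` as its own address, so `http://[::]:8793` in the startup line is the bind address ("all interfaces") rather than the hostname workers connect to. A minimal illustration, assuming IPv6 is available on the host:

   ```python
   import socket

   # A socket bound to the IPv6 wildcard address reports "::" back - the same
   # "[::]" shown in the log-server startup line. It means "all interfaces",
   # not a resolved hostname.
   s = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
   s.bind(("::", 0))
   print(s.getsockname()[0])  # ::
   s.close()
   ```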
   
   ### How to reproduce
   
   From what I've read in past issues on this topic, the behavior appears to be specific to the environment in which the instance is deployed.
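
   For reference, the non-default configuration I tested looks like this in `airflow.cfg` (the callable is one of the two mentioned above):

   ```ini
   [core]
   hostname_callable = airflow.utils.net.get_host_ip_address
   ```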
   
   ### Operating System
   
   Ubuntu 24.04.3 LTS
   
   ### Versions of Apache Airflow Providers
   
   ```
   apache-airflow-providers-common-compat==1.7.4
   apache-airflow-providers-common-io==1.6.3
   apache-airflow-providers-common-sql==1.28.1
   apache-airflow-providers-docker==4.4.3
   apache-airflow-providers-fab==3.0.0
   apache-airflow-providers-ftp==3.13.2
   apache-airflow-providers-http==5.3.4
   apache-airflow-providers-imap==3.9.2
   apache-airflow-providers-jdbc==5.2.3
   apache-airflow-providers-microsoft-mssql==4.3.2
   apache-airflow-providers-odbc==4.10.2
   apache-airflow-providers-oracle==4.2.0
   apache-airflow-providers-postgres==6.3.0
   apache-airflow-providers-slack==9.3.0
   apache-airflow-providers-smtp==2.2.1
   apache-airflow-providers-ssh==4.1.4
   apache-airflow-providers-standard==1.8.0
   apache-airflow-providers-tableau==5.2.0
   ```
   
   ### Deployment
   
   Other
   
   ### Deployment details
   
   python venv pip installation
   
   ### Anything else?
   
   This occurs regardless of which DAG or task the scheduler starts.
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

