GitHub user menal304 created a discussion: Issue when trying to run Airflow > 3.0 on WSL 2 Ubuntu 24.04

**Apache Airflow version**
Any version above 3.0.0 (Python 3.12.3)

**Deployment**
WSL 2 (Ubuntu-24.04)

**What happened**
I did a clean installation of Airflow 3.1.0:
- `rm -r ~/airflow`
- `python3 -m venv ~/virtualEnv/devEnvAirflow`
- `pip install "apache-airflow[celery]==3.1.0" --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-3.1.0/constraints-3.12.txt"`

When I then start Airflow via `airflow standalone`, or start each Airflow service individually, I run into trouble when scheduling a DAG: once the scheduler is up, the tasks are queued but never started, because the scheduler service dies with the following error:

```
(devEnvAirflow) menal304@DESKTOPPC:~$ airflow scheduler
[2025-10-06T09:12:10.703+0200] {providers_manager.py:953} INFO - The hook_class 'airflow.providers.standard.hooks.filesystem.FSHook' is not fully initialized (UI widgets will be missing), because the 'flask_appbuilder' package is not installed, however it is not required for Airflow components to work
[2025-10-06T09:12:10.704+0200] {providers_manager.py:953} INFO - The hook_class 'airflow.providers.standard.hooks.package_index.PackageIndexHook' is not fully initialized (UI widgets will be missing), because the 'flask_appbuilder' package is not installed, however it is not required for Airflow components to work
  ____________       _____________
 ____    |__( )_________  __/__  /________      __
____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
 _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
[2025-10-06T09:12:10.781+0200] {workday.py:41} WARNING - Could not import pandas. Holidays will not be considered.
[2025-10-06T09:12:10.802+0200] {scheduler_job_runner.py:1022} INFO - Starting the scheduler
[2025-10-06 09:12:10 +0200] [2437] [INFO] Starting gunicorn 23.0.0
[2025-10-06T09:12:10.805+0200] {executor_loader.py:269} INFO - Loaded executor: :LocalExecutor:
[2025-10-06 09:12:10 +0200] [2437] [INFO] Listening at: http://[::]:8793 (2437)
[2025-10-06 09:12:10 +0200] [2437] [INFO] Using worker: sync
[2025-10-06T09:12:10.810+0200] {scheduler_job_runner.py:2218} INFO - Adopting or resetting orphaned tasks for active dag runs
[2025-10-06 09:12:10 +0200] [2438] [INFO] Booting worker with pid: 2438
[2025-10-06 09:12:10 +0200] [2439] [INFO] Booting worker with pid: 2439
[2025-10-06T09:15:30.037+0200] {dag.py:2236} INFO - Setting next_dagrun for example_bash_operator to 2025-10-07 00:00:00+00:00, run_after=2025-10-07 00:00:00+00:00
Dag run  in running state
Dag information Queued at: 2025-10-06 07:15:30.031102+00:00 version: 1
Dag run  in running state
Dag information Queued at: 2025-10-06 07:15:29.957317+00:00 version: 1
[2025-10-06T09:15:30.089+0200] {scheduler_job_runner.py:461} INFO - 10 tasks up for execution:
        <TaskInstance: example_bash_operator.runme_0 scheduled__2025-10-06T00:00:00+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.runme_1 scheduled__2025-10-06T00:00:00+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.runme_2 scheduled__2025-10-06T00:00:00+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.runme_0 manual__2025-10-06T07:15:29.951225+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.runme_1 manual__2025-10-06T07:15:29.951225+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.runme_2 manual__2025-10-06T07:15:29.951225+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.also_run_this scheduled__2025-10-06T00:00:00+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.this_will_skip scheduled__2025-10-06T00:00:00+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.also_run_this manual__2025-10-06T07:15:29.951225+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.this_will_skip manual__2025-10-06T07:15:29.951225+00:00 [scheduled]>
[2025-10-06T09:15:30.089+0200] {scheduler_job_runner.py:533} INFO - DAG example_bash_operator has 0/16 running and queued tasks
[2025-10-06T09:15:30.089+0200] {scheduler_job_runner.py:533} INFO - DAG example_bash_operator has 1/16 running and queued tasks
[2025-10-06T09:15:30.089+0200] {scheduler_job_runner.py:533} INFO - DAG example_bash_operator has 2/16 running and queued tasks
[2025-10-06T09:15:30.090+0200] {scheduler_job_runner.py:533} INFO - DAG example_bash_operator has 0/16 running and queued tasks
[2025-10-06T09:15:30.090+0200] {scheduler_job_runner.py:533} INFO - DAG example_bash_operator has 1/16 running and queued tasks
[2025-10-06T09:15:30.090+0200] {scheduler_job_runner.py:533} INFO - DAG example_bash_operator has 2/16 running and queued tasks
[2025-10-06T09:15:30.090+0200] {scheduler_job_runner.py:533} INFO - DAG example_bash_operator has 3/16 running and queued tasks
[2025-10-06T09:15:30.090+0200] {scheduler_job_runner.py:533} INFO - DAG example_bash_operator has 4/16 running and queued tasks
[2025-10-06T09:15:30.090+0200] {scheduler_job_runner.py:533} INFO - DAG example_bash_operator has 3/16 running and queued tasks
[2025-10-06T09:15:30.090+0200] {scheduler_job_runner.py:533} INFO - DAG example_bash_operator has 4/16 running and queued tasks
[2025-10-06T09:15:30.090+0200] {scheduler_job_runner.py:672} INFO - Setting the following tasks to queued state:
        <TaskInstance: example_bash_operator.runme_0 scheduled__2025-10-06T00:00:00+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.runme_1 scheduled__2025-10-06T00:00:00+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.runme_2 scheduled__2025-10-06T00:00:00+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.runme_0 manual__2025-10-06T07:15:29.951225+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.runme_1 manual__2025-10-06T07:15:29.951225+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.runme_2 manual__2025-10-06T07:15:29.951225+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.also_run_this scheduled__2025-10-06T00:00:00+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.this_will_skip scheduled__2025-10-06T00:00:00+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.also_run_this manual__2025-10-06T07:15:29.951225+00:00 [scheduled]>
        <TaskInstance: example_bash_operator.this_will_skip manual__2025-10-06T07:15:29.951225+00:00 [scheduled]>
[2025-10-06T09:15:30.091+0200] {scheduler_job_runner.py:778} INFO - Trying to enqueue tasks: [<TaskInstance: example_bash_operator.runme_0 scheduled__2025-10-06T00:00:00+00:00 [scheduled]>, <TaskInstance: example_bash_operator.runme_1 scheduled__2025-10-06T00:00:00+00:00 [scheduled]>, <TaskInstance: example_bash_operator.runme_2 scheduled__2025-10-06T00:00:00+00:00 [scheduled]>, <TaskInstance: example_bash_operator.runme_0 manual__2025-10-06T07:15:29.951225+00:00 [scheduled]>, <TaskInstance: example_bash_operator.runme_1 manual__2025-10-06T07:15:29.951225+00:00 [scheduled]>, <TaskInstance: example_bash_operator.runme_2 manual__2025-10-06T07:15:29.951225+00:00 [scheduled]>, <TaskInstance: example_bash_operator.also_run_this scheduled__2025-10-06T00:00:00+00:00 [scheduled]>, <TaskInstance: example_bash_operator.this_will_skip scheduled__2025-10-06T00:00:00+00:00 [scheduled]>, <TaskInstance: example_bash_operator.also_run_this manual__2025-10-06T07:15:29.951225+00:00 [scheduled]>, <TaskInstance: example_bash_operator.this_will_skip manual__2025-10-06T07:15:29.951225+00:00 [scheduled]>] for executor: LocalExecutor(parallelism=32)
[2025-10-06T09:15:30.099+0200] {local_executor.py:61} INFO - Worker starting up pid=3295
[2025-10-06T09:15:30.102+0200] {local_executor.py:61} INFO - Worker starting up pid=3296
[2025-10-06T09:15:30.106+0200] {local_executor.py:61} INFO - Worker starting up pid=3297
[2025-10-06T09:15:30.111+0200] {local_executor.py:61} INFO - Worker starting up pid=3298
[2025-10-06T09:15:30.117+0200] {local_executor.py:61} INFO - Worker starting up pid=3299
[2025-10-06T09:15:30.121+0200] {local_executor.py:61} INFO - Worker starting up pid=3300
[2025-10-06T09:15:30.127+0200] {local_executor.py:61} INFO - Worker starting up pid=3301
[2025-10-06T09:15:30.135+0200] {local_executor.py:61} INFO - Worker starting up pid=3302
[2025-10-06T09:15:30.142+0200] {local_executor.py:61} INFO - Worker starting up pid=3303
[2025-10-06T09:15:30.149+0200] {local_executor.py:61} INFO - Worker starting up pid=3304
2025-10-06 09:15:30 [info     ] Secrets backends loaded for worker [supervisor] backend_classes=['EnvironmentVariablesBackend'] count=1
2025-10-06 09:15:30 [info     ] Secrets backends loaded for worker [supervisor] backend_classes=['EnvironmentVariablesBackend'] count=1
2025-10-06 09:15:30 [info     ] Secrets backends loaded for worker [supervisor] backend_classes=['EnvironmentVariablesBackend'] count=1
/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/airflow/sdk/execution_time/supervisor.py:476 DeprecationWarning: This process (pid=3295) is multi-threaded, use of fork() may lead to deadlocks in the child.
/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/airflow/sdk/execution_time/supervisor.py:476 DeprecationWarning: This process (pid=3296) is multi-threaded, use of fork() may lead to deadlocks in the child.
/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/airflow/sdk/execution_time/supervisor.py:476 DeprecationWarning: This process (pid=3297) is multi-threaded, use of fork() may lead to deadlocks in the child.
2025-10-06 09:15:30 [info     ] Secrets backends loaded for worker [supervisor] backend_classes=['EnvironmentVariablesBackend'] count=1
2025-10-06 09:15:30 [info     ] Secrets backends loaded for worker [supervisor] backend_classes=['EnvironmentVariablesBackend'] count=1
/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/airflow/sdk/execution_time/supervisor.py:476 DeprecationWarning: This process (pid=3298) is multi-threaded, use of fork() may lead to deadlocks in the child.
2025-10-06 09:15:30 [info     ] Secrets backends loaded for worker [supervisor] backend_classes=['EnvironmentVariablesBackend'] count=1
[2025-10-06T09:15:30.259+0200] {_client.py:1025} INFO - HTTP Request: PATCH http://localhost:8080/execution/task-instances/0199b860-1a7d-79dd-8ae0-87cde63c2108/run "HTTP/1.0 400 Bad Request"
[2025-10-06T09:15:30.259+0200] {_client.py:1025} INFO - HTTP Request: PATCH http://localhost:8080/execution/task-instances/0199b860-1a7e-7b73-bb7b-46748585541e/run "HTTP/1.0 400 Bad Request"
[2025-10-06T09:15:30.260+0200] {_client.py:1025} INFO - HTTP Request: PATCH http://localhost:8080/execution/task-instances/0199b860-1a7c-74cc-8dff-818b145175ef/run "HTTP/1.0 400 Bad Request"
/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/airflow/sdk/execution_time/supervisor.py:476 DeprecationWarning: This process (pid=3299) is multi-threaded, use of fork() may lead to deadlocks in the child.
2025-10-06 09:15:30 [info     ] Secrets backends loaded for worker [supervisor] backend_classes=['EnvironmentVariablesBackend'] count=1
2025-10-06 09:15:30 [info     ] Process exited                 [supervisor] exit_code=<Negsignal.SIGKILL: -9> pid=3306 signal_sent=SIGKILL
2025-10-06 09:15:30 [info     ] Process exited                 [supervisor] exit_code=<Negsignal.SIGKILL: -9> pid=3307 signal_sent=SIGKILL
2025-10-06 09:15:30 [info     ] Process exited                 [supervisor] exit_code=<Negsignal.SIGKILL: -9> pid=3305 signal_sent=SIGKILL
/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/airflow/sdk/execution_time/supervisor.py:476 DeprecationWarning: This process (pid=3300) is multi-threaded, use of fork() may lead to deadlocks in the child.
/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/airflow/sdk/execution_time/supervisor.py:476 DeprecationWarning: This process (pid=3301) is multi-threaded, use of fork() may lead to deadlocks in the child.
[2025-10-06T09:15:30.267+0200] {local_executor.py:96} ERROR - uhoh
Traceback (most recent call last):
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/airflow/executors/local_executor.py", line 92, in _run_worker
    _execute_work(log, workload)
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/airflow/executors/local_executor.py", line 120, in _execute_work
    supervise(
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/airflow/sdk/execution_time/supervisor.py", line 1829, in supervise
    process = ActivitySubprocess.start(
              ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/airflow/sdk/execution_time/supervisor.py", line 933, in start
    proc._on_child_started(ti=what, dag_rel_path=dag_rel_path, bundle_info=bundle_info)
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/airflow/sdk/execution_time/supervisor.py", line 944, in _on_child_started
    ti_context = self.client.task_instances.start(ti.id, self.pid, start_date)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/airflow/sdk/api/client.py", line 152, in start
    resp = self.client.patch(f"task-instances/{id}/run", content=body.model_dump_json())
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/httpx/_client.py", line 1218, in patch
    return self.request(
           ^^^^^^^^^^^^^
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/tenacity/__init__.py", line 338, in wrapped_f
    return copy(f, *args, **kw)
           ^^^^^^^^^^^^^^^^^^^^
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/tenacity/__init__.py", line 477, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/tenacity/__init__.py", line 378, in iter
    result = action(retry_state)
             ^^^^^^^^^^^^^^^^^^^
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/tenacity/__init__.py", line 400, in <lambda>
    self._add_action_func(lambda rs: rs.outcome.result())
                                     ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/tenacity/__init__.py", line 480, in __call__
    result = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/airflow/sdk/api/client.py", line 735, in request
    return super().request(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/httpx/_client.py", line 825, in request
    return self.send(request, auth=auth, follow_redirects=follow_redirects)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/httpx/_client.py", line 914, in send
    response = self._send_handling_auth(
               ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/httpx/_client.py", line 942, in _send_handling_auth
    response = self._send_handling_redirects(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/httpx/_client.py", line 999, in _send_handling_redirects
    raise exc
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/httpx/_client.py", line 982, in _send_handling_redirects
    hook(response)
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/airflow/sdk/api/client.py", line 123, in raise_on_4xx_5xx_with_note
    return get_json_error(response) or response.raise_for_status()
                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/menal304/virtualEnv/devEnvAirflow/lib/python3.12/site-packages/httpx/_models.py", line 829, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'http://localhost:8080/execution/task-instances/0199b860-1a7e-7b73-bb7b-46748585541e/run'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400
Correlation-id=0199b860-1b45-7c0a-a43e-b443958571ff
```
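
Reading the traceback, the failure point is the Task SDK supervisor's startup handshake: `ActivitySubprocess.start()` calls `client.task_instances.start(...)`, which sends the `PATCH task-instances/{id}/run` request, the API server answers `400 Bad Request`, and the just-forked task process is SIGKILLed. As a minimal sketch of that call shape in plain `httpx` (a hypothetical reconstruction for illustration only; the payload is left abstract because I don't know which field the server rejects):

```python
import httpx

# Hypothetical reconstruction of the failing handshake, for illustration only;
# the real call lives in airflow.sdk.api.client (see the traceback above).
def start_handshake(base_url: str, ti_id: str, payload: dict) -> None:
    with httpx.Client(base_url=base_url) as client:
        resp = client.patch(f"/execution/task-instances/{ti_id}/run", json=payload)
        resp.raise_for_status()  # raises httpx.HTTPStatusError on the 400 above
```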

**Anything else**
- When running my DAGs with `dag.test()`, everything works as expected (see the sketch after this list).
- This error happens on every version above 3.0.0; on versions below 3.0.0 everything works as expected.
- The WSL settings are 20 CPU cores and 8 GB of RAM.
- I did not change anything in the config file (defaults, i.e. the LocalExecutor).
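
For reference, this is roughly the `dag.test()` path that works for me. The DAG below is a stand-in, not my real one; the dag_id and command are illustrative:

```python
# Stand-in DAG to illustrate the dag.test() path that works for me on WSL.
# Run with `python this_file.py` inside the activated venv.
from airflow.sdk import DAG
from airflow.providers.standard.operators.bash import BashOperator

with DAG(dag_id="wsl_repro") as dag:
    BashOperator(task_id="runme_0", bash_command="echo hello from WSL")

if __name__ == "__main__":
    # dag.test() executes the tasks in-process, without the scheduler or the
    # LocalExecutor's forked workers, which is presumably why it succeeds here.
    dag.test()
```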

I know that WSL is not officially supported; my production server runs on Linux, but for local development I want to run Airflow under WSL. Does anyone have a solution to this problem?
If more information is needed, please let me know.

GitHub link: https://github.com/apache/airflow/discussions/56428
