ivanrezic-maistra opened a new issue #14896:
URL: https://github.com/apache/airflow/issues/14896
**Apache Airflow version**: 2.0.1
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): /
**Environment**:
- **OS** (e.g. from /etc/os-release): Ubuntu 18.04
- **Kernel** (e.g. `uname -a`): Linux ivan-pc 5.4.0-66-generic #74~18.04.2-Ubuntu SMP Fri Feb 5 11:17:31 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux
**What happened**:
I am using the LocalExecutor, the same way I did on Apache Airflow 1.10.12: a single PythonOperator whose Python callable runs a multiprocessing job via `concurrent.futures.ProcessPoolExecutor`. On the earlier version it ran without any problems, but now I get the error below (a minimal sketch of the callable pattern follows the traceback):
```
[2021-03-18 15:38:37,552] {taskinstance.py:1455} ERROR - daemonic processes are not allowed to have children
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1112, in _run_raw_task
    self._prepare_and_execute_task_with_callbacks(context, task)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1285, in _prepare_and_execute_task_with_callbacks
    result = self._execute_task(context, task_copy)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1315, in _execute_task
    result = task_copy.execute(context=context)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/operators/python.py", line 117, in execute
    return_value = self.execute_callable()
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/operators/python.py", line 128, in execute_callable
    return self.python_callable(*self.op_args, **self.op_kwargs)
  File "/home/airflow/Edmond/edmond/backend/main_parts/run_algorithm_o_book.py", line 15, in run_algorithm_o_book
    alg_o_output = run_o(k_output, capacity, OModelBook, config)
  File "/home/airflow/Edmond/edmond/models/O/model.py", line 388, in run_o
    for mid_result in executor.map(_run, args):
  File "/usr/local/lib/python3.6/concurrent/futures/process.py", line 496, in map
    timeout=timeout)
  File "/usr/local/lib/python3.6/concurrent/futures/_base.py", line 575, in map
    fs = [self.submit(fn, *args) for args in zip(*iterables)]
  File "/usr/local/lib/python3.6/concurrent/futures/_base.py", line 575, in <listcomp>
    fs = [self.submit(fn, *args) for args in zip(*iterables)]
  File "/usr/local/lib/python3.6/concurrent/futures/process.py", line 466, in submit
    self._start_queue_management_thread()
  File "/usr/local/lib/python3.6/concurrent/futures/process.py", line 427, in _start_queue_management_thread
    self._adjust_process_count()
  File "/usr/local/lib/python3.6/concurrent/futures/process.py", line 446, in _adjust_process_count
    p.start()
  File "/usr/local/lib/python3.6/multiprocessing/process.py", line 103, in start
    'daemonic processes are not allowed to have children'
AssertionError: daemonic processes are not allowed to have children
```
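For illustration, the failing callable follows roughly this pattern (a minimal sketch only; the function and argument names are placeholders, not my real code):

```python
from concurrent.futures import ProcessPoolExecutor


def _run(chunk):
    # Placeholder for the per-chunk work done by the real model code.
    return sum(chunk)


def run_model(chunks):
    # Creating worker processes from inside the Airflow task is the step
    # that raises the AssertionError shown in the traceback above.
    with ProcessPoolExecutor(max_workers=4) as executor:
        return [result for result in executor.map(_run, chunks)]
```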
**What you expected to happen**:
I expected it to run as it did on Airflow 1.10.12.
**How to reproduce it**:
Run Airflow with a `docker-compose` file like this:
```
version: '3.8'

x-airflow-common:
  &airflow-common
  image: edmond_image
  env_file:
    - compose-services.env
  volumes:
    - ./dags:/opt/airflow/dags
    - ./logs:/opt/airflow/logs
    - ./scripts:/opt/airflow/scripts
    - ./notebooks:/home/airflow/Edmond/notebooks
    - ./data:/home/airflow/Edmond/data
  depends_on:
    - postgres
  restart: always

services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow

  airflow-webserver:
    <<: *airflow-common
    command: webserver
    ports:
      - 8090:8080

  airflow-scheduler:
    <<: *airflow-common
    command: scheduler

  airflow-init:
    <<: *airflow-common
    restart: on-failure
    environment:
      _AIRFLOW_DB_UPGRADE: 'true'
      _AIRFLOW_WWW_USER_CREATE: 'true'
      _AIRFLOW_WWW_USER_USERNAME: 'Admin'
      _AIRFLOW_WWW_USER_PASSWORD: 'Admin'
```
Then, inside a DAG, run a PythonOperator whose callable uses `concurrent.futures.ProcessPoolExecutor`, as in the sketch below.
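For completeness, a minimal DAG of the kind described above would look roughly like this (a sketch only; the DAG id, schedule, and workload are placeholders, not my actual code):

```python
from datetime import datetime
from concurrent.futures import ProcessPoolExecutor

from airflow import DAG
from airflow.operators.python import PythonOperator


def _work(item):
    # Placeholder for the real per-item computation.
    return item * 2


def run_parallel_job():
    # The process pool inside the callable is what triggers the
    # "daemonic processes are not allowed to have children" error.
    with ProcessPoolExecutor(max_workers=4) as executor:
        return list(executor.map(_work, range(10)))


with DAG(
    dag_id="process_pool_repro",
    start_date=datetime(2021, 3, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    run_task = PythonOperator(
        task_id="run_parallel_job",
        python_callable=run_parallel_job,
    )
```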
**Anything else we need to know**:
This problem occurs every time I run a PythonOperator that uses multiprocessing. I have searched everywhere without any luck. There seems to be a similar error reported for the CeleryExecutor, but it doesn't help here, as I am using the LocalExecutor and there are no import collisions.