ahazeemi commented on issue #14896:
URL: https://github.com/apache/airflow/issues/14896#issuecomment-866768004


   > I got the same error, `daemonic processes are not allowed to have children`, from the celery worker when `CeleryKubernetesExecutor` was used, but the error went away when `CeleryExecutor` was used.
   > I did some searching; there are two options for fixing this:
   > Workaround: set `PYTHONOPTIMIZE=1` to skip the assert.
   > [The possible solution](https://github.com/celery/celery/issues/4525): `celery worker -P threads`. The problem is that I can't pass the `-P threads` argument to the airflow command `airflow celery worker`.
   > 
   > Context:
   > 
   > * apache-airflow==2.0.1
   > * apache-airflow-providers-celery==1.0.1
   > * celery==4.4.7
   > * Python==3.7.10
   > 
   > ```
   > [2021-06-18 08:09:46,393: ERROR/ForkPoolWorker-15] Failed to execute task daemonic processes are not allowed to have children.
   > Traceback (most recent call last):
   >   File "/usr/local/lib/python3.7/site-packages/airflow/executors/celery_executor.py", line 116, in _execute_in_fork
   >     args.func(args)
   >   File "/usr/local/lib/python3.7/site-packages/airflow/cli/cli_parser.py", line 48, in command
   >     return func(*args, **kwargs)
   >   File "/usr/local/lib/python3.7/site-packages/airflow/utils/cli.py", line 91, in wrapper
   >     return f(*args, **kwargs)
   >   File "/usr/local/lib/python3.7/site-packages/airflow/cli/commands/task_command.py", line 237, in task_run
   >     _run_task_by_selected_method(args, dag, ti)
   >   File "/usr/local/lib/python3.7/site-packages/airflow/cli/commands/task_command.py", line 64, in _run_task_by_selected_method
   >     _run_task_by_local_task_job(args, ti)
   >   File "/usr/local/lib/python3.7/site-packages/airflow/cli/commands/task_command.py", line 117, in _run_task_by_local_task_job
   >     pool=args.pool,
   >   File "<string>", line 4, in __init__
   >   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/state.py", line 433, in _initialize_instance
   >     manager.dispatch.init_failure(self, args, kwargs)
   >   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/langhelpers.py", line 70, in __exit__
   >     with_traceback=exc_tb,
   >   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
   >     raise exception
   >   File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/state.py", line 430, in _initialize_instance
   >     return manager.original_init(*mixed[1:], **kwargs)
   >   File "/usr/local/lib/python3.7/site-packages/airflow/jobs/local_task_job.py", line 68, in __init__
   >     super().__init__(*args, **kwargs)
   >   File "<string>", line 6, in __init__
   >   File "/usr/local/lib/python3.7/site-packages/airflow/jobs/base_job.py", line 97, in __init__
   >     self.executor = executor or ExecutorLoader.get_default_executor()
   >   File "/usr/local/lib/python3.7/site-packages/airflow/executors/executor_loader.py", line 62, in get_default_executor
   >     cls._default_executor = cls.load_executor(executor_name)
   >   File "/usr/local/lib/python3.7/site-packages/airflow/executors/executor_loader.py", line 79, in load_executor
   >     return cls.__load_celery_kubernetes_executor()
   >   File "/usr/local/lib/python3.7/site-packages/airflow/executors/executor_loader.py", line 116, in __load_celery_kubernetes_executor
   >     kubernetes_executor = import_string(cls.executors[KUBERNETES_EXECUTOR])()
   >   File "/usr/local/lib/python3.7/site-packages/airflow/executors/kubernetes_executor.py", line 421, in __init__
   >     self._manager = multiprocessing.Manager()
   >   File "/usr/local/lib/python3.7/multiprocessing/context.py", line 56, in Manager
   >     m.start()
   >   File "/usr/local/lib/python3.7/multiprocessing/managers.py", line 563, in start
   >     self._process.start()
   >   File "/usr/local/lib/python3.7/multiprocessing/process.py", line 110, in start
   >     'daemonic processes are not allowed to have children'
   > AssertionError: daemonic processes are not allowed to have children
   > [2021-06-18 08:09:46,413: ERROR/ForkPoolWorker-15] Task airflow.executors.celery_executor.execute_command[0b6b839e-87cd-4ac4-a61d-d001e848665a] raised unexpected: AirflowException('Celery command failed on host: airflow-default-worker-3.airflow-worker.default.svc.cluster.local')
   > Traceback (most recent call last):
   >   File "/usr/local/lib/python3.7/site-packages/celery/app/trace.py", line 412, in trace_task
   >     R = retval = fun(*args, **kwargs)
   >   File "/usr/local/lib/python3.7/site-packages/celery/app/trace.py", line 704, in __protected_call__
   >     return self.run(*args, **kwargs)
   >   File "/usr/local/lib/python3.7/site-packages/airflow/executors/celery_executor.py", line 87, in execute_command
   >     _execute_in_fork(command_to_exec)
   >   File "/usr/local/lib/python3.7/site-packages/airflow/executors/celery_executor.py", line 98, in _execute_in_fork
   >     raise AirflowException('Celery command failed on host: ' + get_hostname())
   > airflow.exceptions.AirflowException: Celery command failed on host: airflow-default-worker-3.airflow-worker.default.svc.cluster.local
   > ```
   
   @damon09273 Were you able to resolve this issue?
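   
   For anyone else hitting this, here is a rough, untested sketch of the two workarounds quoted above. The direct `celery` invocation, the queue name, and the idea of starting the worker outside `airflow celery worker` are assumptions on my side, not something the Airflow 2.0.1 docs describe:
   
   ```
   # Workaround 1: PYTHONOPTIMIZE=1 makes Python drop assert statements, so the
   # "daemonic processes are not allowed to have children" check is skipped.
   export PYTHONOPTIMIZE=1
   airflow celery worker
   
   # Workaround 2 (assumption): start the worker through the celery CLI instead of
   # `airflow celery worker`, since the celery CLI accepts --pool/-P. Airflow 2.0.x
   # exposes its Celery app in airflow.executors.celery_executor; the queue name
   # must match your [celery] default_queue.
   celery --app airflow.executors.celery_executor worker --pool threads --queues default
   ```
   
   Note that the first option only hides the assert rather than changing the worker's process model, so treat it as a stopgap.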


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

