kaxil opened a new pull request #16700:
URL: https://github.com/apache/airflow/pull/16700


   closes https://github.com/apache/airflow/issues/16326
   
   Currently, when running Celery tasks with ``CeleryKubernetesExecutor``, we see the following error. It occurs because ``BaseJob`` (via ``LocalTaskJob``) needlessly instantiates a ``KubernetesExecutor``, which in turn tries to create a multiprocessing process/Manager; that fails inside the daemonic Celery worker process.
   
   ```
   [2021-06-29 00:23:45,301: ERROR/ForkPoolWorker-16] Failed to execute task daemonic processes are not allowed to have children.
   Traceback (most recent call last):
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/executors/celery_executor.py", line 116, in _execute_in_fork
       args.func(args)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/cli/cli_parser.py", line 48, in command
       return func(*args, **kwargs)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/cli.py", line 91, in wrapper
       return f(*args, **kwargs)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/cli/commands/task_command.py", line 237, in task_run
       _run_task_by_selected_method(args, dag, ti)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/cli/commands/task_command.py", line 64, in _run_task_by_selected_method
       _run_task_by_local_task_job(args, ti)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/cli/commands/task_command.py", line 117, in _run_task_by_local_task_job
       pool=args.pool,
     File "<string>", line 4, in __init__
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/state.py", line 433, in _initialize_instance
       manager.dispatch.init_failure(self, args, kwargs)
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/util/langhelpers.py", line 70, in __exit__
       with_traceback=exc_tb,
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
       raise exception
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/state.py", line 430, in _initialize_instance
       return manager.original_init(*mixed[1:], **kwargs)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/jobs/local_task_job.py", line 76, in __init__
       super().__init__(*args, **kwargs)
     File "<string>", line 6, in __init__
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/jobs/base_job.py", line 97, in __init__
       self.executor = executor or ExecutorLoader.get_default_executor()
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/executors/executor_loader.py", line 62, in get_default_executor
       cls._default_executor = cls.load_executor(executor_name)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/executors/executor_loader.py", line 79, in load_executor
       return cls.__load_celery_kubernetes_executor()
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/executors/executor_loader.py", line 116, in __load_celery_kubernetes_executor
       kubernetes_executor = import_string(cls.executors[KUBERNETES_EXECUTOR])()
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/executors/kubernetes_executor.py", line 421, in __init__
       self._manager = multiprocessing.Manager()
     File "/usr/local/lib/python3.6/multiprocessing/context.py", line 56, in Manager
       m.start()
     File "/usr/local/lib/python3.6/multiprocessing/managers.py", line 513, in start
       self._process.start()
     File "/usr/local/lib/python3.6/multiprocessing/process.py", line 103, in start
       'daemonic processes are not allowed to have children'
   AssertionError: daemonic processes are not allowed to have children
   ```
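
   The restriction itself comes from Python's ``multiprocessing``, not Airflow: a process whose ``daemon`` flag is set (as Celery's fork-pool workers are) may not start children of its own, and ``multiprocessing.Manager()`` starts a child process under the hood. A minimal stand-alone reproduction (illustrative names only, not Airflow code):

   ```python
   import multiprocessing


   def try_start_manager(result_queue):
       """Attempt to create a Manager inside this process and report the outcome."""
       try:
           manager = multiprocessing.Manager()  # starts a child process under the hood
           manager.shutdown()
           result_queue.put("manager started")
       except AssertionError as exc:
           result_queue.put(str(exc))


   def run_in_daemon():
       """Run try_start_manager inside a daemonic process, mimicking a Celery worker."""
       queue = multiprocessing.Queue()
       proc = multiprocessing.Process(target=try_start_manager, args=(queue,), daemon=True)
       proc.start()
       proc.join()
       return queue.get()


   if __name__ == "__main__":
       print(run_in_daemon())  # "daemonic processes are not allowed to have children"
   ```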
   
   We don't need to instantiate an executor when running ``LocalTaskJob``, as the executor is never used there.
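
   One way to defer that work (a sketch of the general approach with simplified stand-in classes, not the exact diff in this PR) is to resolve the executor lazily, so jobs that never touch ``.executor`` never trigger ``ExecutorLoader``:

   ```python
   from functools import cached_property  # Python 3.8+


   class ExecutorLoader:
       """Simplified stand-in for Airflow's ExecutorLoader."""

       calls = 0

       @classmethod
       def get_default_executor(cls):
           cls.calls += 1  # track instantiations so the laziness is observable
           return "default-executor"


   class BaseJob:
       def __init__(self, executor=None):
           # Remember what was passed instead of eagerly resolving a default,
           # so LocalTaskJob-style jobs never pay the KubernetesExecutor cost.
           self._executor = executor

       @cached_property
       def executor(self):
           return self._executor or ExecutorLoader.get_default_executor()
   ```

   With this shape, ``BaseJob()`` alone performs no executor loading; the first access to ``job.executor`` does, and ``cached_property`` makes it a one-time cost.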
   
   <!--
   Thank you for contributing! Please make sure that your code changes
   are covered with tests. And in case of new features or big changes
   remember to adjust the documentation.
   
   Feel free to ping committers for the review!
   
   In case of existing issue, reference it using one of the following:
   
   closes: #ISSUE
   related: #ISSUE
   
   How to write a good git commit message:
   http://chris.beams.io/posts/git-commit/
   -->
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)** for more information.
   In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   

