dstandish opened a new issue #13263:
URL: https://github.com/apache/airflow/issues/13263
CeleryKubernetesExecutor is broken
I found a number of issues:
* it was not propagating `job_id` to the KubernetesExecutor
* it did not have a `slots_available` property

After fixing the above, the scheduler now runs.
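For context, a minimal sketch of the kind of delegation those two fixes amount to, assuming the wrapper simply forwards to the two executors it holds (the attribute names here are illustrative, not the actual patch):
```
# Illustrative sketch only (not the actual Airflow code): the wrapper executor
# forwards job_id and slots_available to the executors it wraps.
class CeleryKubernetesExecutor:
    def __init__(self, celery_executor, kubernetes_executor):
        self.celery_executor = celery_executor
        self.kubernetes_executor = kubernetes_executor
        self._job_id = None

    @property
    def job_id(self):
        return self._job_id

    @job_id.setter
    def job_id(self, value):
        # propagate the scheduler's job id to both wrapped executors,
        # so the KubernetesExecutor does not run with job_id=None
        self._job_id = value
        self.celery_executor.job_id = value
        self.kubernetes_executor.job_id = value

    @property
    def slots_available(self):
        # the scheduler reads this to decide how many task instances to queue
        return self.celery_executor.slots_available
```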
But the Celery worker still has issues.
I get the error `Failed to execute task daemonic processes are not allowed to have children.`
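That message comes from Python's `multiprocessing` guard rather than anything Airflow-specific: a process marked as daemonic is not allowed to start child processes. A minimal standalone reproduction of that restriction (no Airflow involved) looks like this:
```
# Standalone reproduction of the Python restriction behind the error above;
# nothing here is Airflow code. A daemonic process may not start children.
import multiprocessing


def child():
    print("child ran")


def daemon_body():
    # raises AssertionError: daemonic processes are not allowed to have children
    multiprocessing.Process(target=child).start()


if __name__ == "__main__":
    p = multiprocessing.Process(target=daemon_body, daemon=True)
    p.start()
    p.join()
```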
```
[2020-12-22 20:38:42,532: INFO/MainProcess] Connected to redis://:**@airflow-redis:6379/0
[2020-12-22 20:38:42,546: INFO/MainProcess] mingle: searching for neighbors
[2020-12-22 20:38:43,545: INFO/MainProcess] mingle: all alone
[2020-12-22 20:38:43,581: INFO/MainProcess] celery@airflow-worker-0 ready.
[2020-12-22 20:38:43,590: INFO/MainProcess] Received task: airflow.executors.celery_executor.execute_command[81f701fd-e379-4ff7-9b20-e6c88123a3cb]
[2020-12-22 20:38:43,596: INFO/MainProcess] Received task: airflow.executors.celery_executor.execute_command[9d6bf5eb-fbde-4b13-a171-27d6e8e1ee43]
[2020-12-22 20:38:43,600: INFO/MainProcess] Received task: airflow.executors.celery_executor.execute_command[736f0f62-34f2-4ae4-92d5-e88ebf771c16]
[2020-12-22 20:38:43,606: INFO/MainProcess] Received task: airflow.executors.celery_executor.execute_command[ce2e8872-aac2-4463-a9e0-1c8dbe607bee]
[2020-12-22 20:38:43,615: INFO/MainProcess] Events of group {task} enabled by remote.
[2020-12-22 20:38:43,726: INFO/ForkPoolWorker-1] Executing command in Celery: ['airflow', 'tasks', 'run', 'my_example_bash_operator', 'runme_0', '2020-12-22T20:38:01.271670+00:00', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/dags/test_dag.py']
[2020-12-22 20:38:43,746: INFO/ForkPoolWorker-2] Executing command in Celery: ['airflow', 'tasks', 'run', 'my_example_bash_operator', 'runme_1', '2020-12-22T20:38:01.271670+00:00', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/dags/test_dag.py']
[2020-12-22 20:38:43,762: INFO/ForkPoolWorker-7] Executing command in Celery: ['airflow', 'tasks', 'run', 'my_example_bash_operator', 'runme_2', '2020-12-22T20:38:01.271670+00:00', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/dags/test_dag.py']
[2020-12-22 20:38:43,769: INFO/ForkPoolWorker-8] Executing command in Celery: ['airflow', 'tasks', 'run', 'my_example_bash_operator', 'also_run_this', '2020-12-22T20:38:01.271670+00:00', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/dags/test_dag.py']
[2020-12-22 20:38:44,055: INFO/ForkPoolWorker-8] Filling up the DagBag from /opt/airflow/dags/test_dag.py
[2020-12-22 20:38:44,085: INFO/ForkPoolWorker-2] Filling up the DagBag from /opt/airflow/dags/test_dag.py
[2020-12-22 20:38:44,091: INFO/ForkPoolWorker-1] Filling up the DagBag from /opt/airflow/dags/test_dag.py
[2020-12-22 20:38:44,142: INFO/ForkPoolWorker-7] Filling up the DagBag from /opt/airflow/dags/test_dag.py
[2020-12-22 20:38:44,324: WARNING/ForkPoolWorker-8] Running <TaskInstance: my_example_bash_operator.also_run_this 2020-12-22T20:38:01.271670+00:00 [queued]> on host airflow-worker-0.airflow-worker.airflow.svc.cluster.local
[2020-12-22 20:38:44,406: WARNING/ForkPoolWorker-2] Running <TaskInstance: my_example_bash_operator.runme_1 2020-12-22T20:38:01.271670+00:00 [queued]> on host airflow-worker-0.airflow-worker.airflow.svc.cluster.local
[2020-12-22 20:38:44,438: WARNING/ForkPoolWorker-1] Running <TaskInstance: my_example_bash_operator.runme_0 2020-12-22T20:38:01.271670+00:00 [queued]> on host airflow-worker-0.airflow-worker.airflow.svc.cluster.local
[2020-12-22 20:38:44,495: ERROR/ForkPoolWorker-8] Failed to execute task daemonic processes are not allowed to have children.
[2020-12-22 20:38:44,543: ERROR/ForkPoolWorker-8] Task airflow.executors.celery_executor.execute_command[9d6bf5eb-fbde-4b13-a171-27d6e8e1ee43] raised unexpected: AirflowException('Celery command failed on host: airflow-worker-0.airflow-worker.airflow.svc.cluster.local')
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/celery/app/trace.py", line 412, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.7/site-packages/celery/app/trace.py", line 704, in __protected_call__
    return self.run(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/executors/celery_executor.py", line 87, in execute_command
    _execute_in_fork(command_to_exec)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/executors/celery_executor.py", line 98, in _execute_in_fork
    raise AirflowException('Celery command failed on host: ' + get_hostname())
airflow.exceptions.AirflowException: Celery command failed on host: airflow-worker-0.airflow-worker.airflow.svc.cluster.local
```