zhbdesign opened a new issue #9828: URL: https://github.com/apache/airflow/issues/9828
**Apache Airflow version**: 1.10.11

**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):

**Environment**: CentOS 7

- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): PRETTY_NAME="CentOS Linux 7 (Core)"
- **Kernel** (e.g. `uname -a`): Linux hadoop-node3 3.10.0-957.el7.x86_64 #1 SMP Thu Nov 8 23:39:32 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
- **Install tools**: Python 3.7, `pip install apache-airflow[all]==1.10.11`
- **Others**:

**What happened**:

I run Airflow as a cluster of four machines. When I execute a task, the work is distributed across the cluster, but each machine then raises errors. The worker log is as follows:

```
[2020-07-15 11:25:46,471: ERROR/ForkPoolWorker-1] None
[2020-07-15 11:25:46,582: ERROR/ForkPoolWorker-2] Task airflow.executors.celery_executor.execute_command[c29ab0dd-7049-4aeb-9023-cda45b9d3462] raised unexpected: AirflowException('Celery command failed',)
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/airflow/executors/celery_executor.py", line 78, in execute_command
    close_fds=True, env=env)
  File "/usr/local/lib/python3.6/subprocess.py", line 291, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['airflow', 'run', 'user_MySql_2_ClickHouse_increment_srt_Activity', 'm2ctask_Homework_SubmitActivity_Member_inc', '2020-07-15T11:25:35.177616+00:00', '--local', '--pool', 'default_pool', '-sd', '/opt/airflow/dags/MySql_2_ClickHouse_srt_Activity.py']' returned non-zero exit status 1.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/celery/app/trace.py", line 412, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/celery/app/trace.py", line 704, in __protected_call__
    return self.run(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/sentry_sdk/integrations/celery.py", line 171, in _inner
    reraise(*exc_info)
  File "/usr/local/lib/python3.6/site-packages/sentry_sdk/_compat.py", line 57, in reraise
    raise value
  File "/usr/local/lib/python3.6/site-packages/sentry_sdk/integrations/celery.py", line 166, in _inner
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/airflow/executors/celery_executor.py", line 83, in execute_command
    raise AirflowException('Celery command failed')
airflow.exceptions.AirflowException: Celery command failed
[2020-07-15 11:25:46,705: ERROR/ForkPoolWorker-1] Task airflow.executors.celery_executor.execute_command[efcd61c3-bae5-43a6-a2ba-ff584ee5a9e9] raised unexpected: AirflowException('Celery command failed',)
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/airflow/executors/celery_executor.py", line 78, in execute_command
    close_fds=True, env=env)
  File "/usr/local/lib/python3.6/subprocess.py", line 291, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['airflow', 'run', 'user_MySql_2_ClickHouse_increment_srt_Activity', 'MySql_2_ClickHouse_Activity_Category_inc', '2020-07-15T11:25:35.177616+00:00', '--local', '--pool', 'default_pool', '-sd', '/opt/airflow/dags/MySql_2_ClickHouse_srt_Activity.py']' returned non-zero exit status 1.
```
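From the traceback, the worker is only reporting that the `airflow run` child process exited with a non-zero status: the generic "Celery command failed" message hides whatever actually failed inside the task process. A minimal sketch of that code path, reconstructed from the traceback above (this is not the actual Airflow source; the `env` handling is omitted):

```python
import subprocess

from airflow.exceptions import AirflowException


def execute_command(command):
    # Sketch of airflow.executors.celery_executor.execute_command,
    # reconstructed from the traceback above (not the real source).
    # `command` is the list the scheduler logs under "Adding to queue:", e.g.
    # ['airflow', 'run', '<dag_id>', '<task_id>', '<execution_date>',
    #  '--local', '--pool', 'default_pool', '-sd', '<dag_file>']
    try:
        # check_call raises CalledProcessError for ANY non-zero exit status...
        subprocess.check_call(command, close_fds=True)
    except subprocess.CalledProcessError:
        # ...which is re-raised as this generic message, so the real cause
        # only appears in the task's own log file, not in the worker log.
        raise AirflowException('Celery command failed')
```

So the worker log alone cannot show why the child process exited with status 1.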
Checking the scheduler, I see the following log:

```
[2020-07-15 14:30:40,779] {scheduler_job.py:958} INFO - 2 tasks up for execution:
    <TaskInstance: user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_Discuss_inc 2020-07-15 14:30:36.508244+00:00 [scheduled]>
    <TaskInstance: user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_inc 2020-07-15 14:30:36.508244+00:00 [scheduled]>
[2020-07-15 14:30:40,891] {scheduler_job.py:989} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 2 task instances ready to be queued
[2020-07-15 14:30:40,892] {scheduler_job.py:1017} INFO - DAG user_MySql_2_ClickHouse_increment_srt_Activity has 0/31 running and queued tasks
[2020-07-15 14:30:40,892] {scheduler_job.py:1017} INFO - DAG user_MySql_2_ClickHouse_increment_srt_Activity has 1/31 running and queued tasks
[2020-07-15 14:30:40,901] {scheduler_job.py:1067} INFO - Setting the following tasks to queued state:
    <TaskInstance: user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_Discuss_inc 2020-07-15 14:30:36.508244+00:00 [scheduled]>
    <TaskInstance: user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_inc 2020-07-15 14:30:36.508244+00:00 [scheduled]>
[2020-07-15 14:30:40,925] {scheduler_job.py:1141} INFO - Setting the following 2 tasks to queued state:
    <TaskInstance: user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_Discuss_inc 2020-07-15 14:30:36.508244+00:00 [queued]>
    <TaskInstance: user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_inc 2020-07-15 14:30:36.508244+00:00 [queued]>
[2020-07-15 14:30:40,925] {scheduler_job.py:1177} INFO - Sending ('user_MySql_2_ClickHouse_increment_srt_Activity', 'MySql_2_ClickHouse_Activity_Activity_Discuss_inc', datetime.datetime(2020, 7, 15, 14, 30, 36, 508244, tzinfo=<Timezone [UTC]>), 1) to executor with priority 1 and queue default
[2020-07-15 14:30:40,926] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'user_MySql_2_ClickHouse_increment_srt_Activity', 'MySql_2_ClickHouse_Activity_Activity_Discuss_inc', '2020-07-15T14:30:36.508244+00:00', '--local', '--pool', 'default_pool', '-sd', '/opt/airflow/dags/MySql_2_ClickHouse_srt_Activity.py']
[2020-07-15 14:30:40,926] {scheduler_job.py:1177} INFO - Sending ('user_MySql_2_ClickHouse_increment_srt_Activity', 'MySql_2_ClickHouse_Activity_Activity_inc', datetime.datetime(2020, 7, 15, 14, 30, 36, 508244, tzinfo=<Timezone [UTC]>), 1) to executor with priority 1 and queue default
[2020-07-15 14:30:40,927] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'user_MySql_2_ClickHouse_increment_srt_Activity', 'MySql_2_ClickHouse_Activity_Activity_inc', '2020-07-15T14:30:36.508244+00:00', '--local', '--pool', 'default_pool', '-sd', '/opt/airflow/dags/MySql_2_ClickHouse_srt_Activity.py']
[2020-07-15 14:30:51,338] {scheduler_job.py:1316} INFO - Executor reports execution of user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_inc execution_date=2020-07-15 14:30:36.508244+00:00 exited with status failed for try_number 1
[2020-07-15 14:30:51,356] {scheduler_job.py:1333} ERROR - Executor reports task instance <TaskInstance: user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_inc 2020-07-15 14:30:36.508244+00:00 [queued]> finished (failed) although the task says its queued. Was the task killed externally?
[2020-07-15 14:30:51,357] {dagbag.py:396} INFO - Filling up the DagBag from /opt/airflow/dags/MySql_2_ClickHouse_srt_Activity.py
ClickHouse_url======================>>>>>>>>>>>>>>>>>>>>>>> jdbc:clickhouse://192.168.10.186:8123/ods
ClickHouse_url======================>>>>>>>>>>>>>>>>>>>>>>> jdbc:clickhouse://192.168.10.186:8123/ods
ClickHouse_url======================>>>>>>>>>>>>>>>>>>>>>>> jdbc:clickhouse://192.168.10.186:8123/ods
ClickHouse_url======================>>>>>>>>>>>>>>>>>>>>>>> jdbc:clickhouse://192.168.10.186:8123/ods
[2020-07-15 14:30:51,511] {taskinstance.py:1150} ERROR - Executor reports task instance <TaskInstance: user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_inc 2020-07-15 14:30:36.508244+00:00 [queued]> finished (failed) although the task says its queued. Was the task killed externally?
NoneType: None
[2020-07-15 14:30:51,620] {taskinstance.py:1194} INFO - Marking task as FAILED. dag_id=user_MySql_2_ClickHouse_increment_srt_Activity, task_id=MySql_2_ClickHouse_Activity_Activity_inc, execution_date=20200715T143036, start_date=, end_date=20200715T143051
[2020-07-15 14:31:15,231] {scheduler_job.py:1316} INFO - Executor reports execution of user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_Discuss_inc execution_date=2020-07-15 14:30:36.508244+00:00 exited with status success for try_number 1
```

**What you expected to happen**:

The same shell command executes correctly when I run it by hand on the server, but under the Celery workers this failure occurs intermittently, with very high probability. I hope a solution can be proposed; I will actively cooperate. Thank you!

I am not sure what went wrong: is it a Python version problem or an Airflow version problem?

**How to reproduce it**:
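Trigger a DAG with several parallel tasks on a multi-node CeleryExecutor cluster; in my setup the error then appears on every run. To rule out the command itself, I run the exact command line from the scheduler log by hand on a worker node, where it succeeds. A minimal sketch of that check (the dag id, task id, execution date, and DAG file path are copied verbatim from the scheduler log above):

```python
import subprocess

# The exact command the executor queued, copied from the scheduler log above.
cmd = [
    'airflow', 'run',
    'user_MySql_2_ClickHouse_increment_srt_Activity',
    'MySql_2_ClickHouse_Activity_Activity_inc',
    '2020-07-15T14:30:36.508244+00:00',
    '--local', '--pool', 'default_pool',
    '-sd', '/opt/airflow/dags/MySql_2_ClickHouse_srt_Activity.py',
]

# Capture the exit status and stderr so the real failure is visible,
# instead of the generic "Celery command failed" in the worker log.
result = subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                        universal_newlines=True)
print('exit status:', result.returncode)
print(result.stderr)
```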
**Anything else we need to know**:

I saw someone in the community who had the same problem, but no solution was ever given. The problem occurs on every execution.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
