zhbdesign commented on issue #9826:
URL: https://github.com/apache/airflow/issues/9826#issuecomment-658576075


   Checking the scheduler shows the following log:
   
   [2020-07-15 14:30:40,779] {scheduler_job.py:958} INFO - 2 tasks up for execution:
        <TaskInstance: user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_Discuss_inc 2020-07-15 14:30:36.508244+00:00 [scheduled]>
        <TaskInstance: user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_inc 2020-07-15 14:30:36.508244+00:00 [scheduled]>
   [2020-07-15 14:30:40,891] {scheduler_job.py:989} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 2 task instances ready to be queued
   [2020-07-15 14:30:40,892] {scheduler_job.py:1017} INFO - DAG user_MySql_2_ClickHouse_increment_srt_Activity has 0/31 running and queued tasks
   [2020-07-15 14:30:40,892] {scheduler_job.py:1017} INFO - DAG user_MySql_2_ClickHouse_increment_srt_Activity has 1/31 running and queued tasks
   [2020-07-15 14:30:40,901] {scheduler_job.py:1067} INFO - Setting the following tasks to queued state:
        <TaskInstance: user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_Discuss_inc 2020-07-15 14:30:36.508244+00:00 [scheduled]>
        <TaskInstance: user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_inc 2020-07-15 14:30:36.508244+00:00 [scheduled]>
   [2020-07-15 14:30:40,925] {scheduler_job.py:1141} INFO - Setting the following 2 tasks to queued state:
        <TaskInstance: user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_Discuss_inc 2020-07-15 14:30:36.508244+00:00 [queued]>
        <TaskInstance: user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_inc 2020-07-15 14:30:36.508244+00:00 [queued]>
   [2020-07-15 14:30:40,925] {scheduler_job.py:1177} INFO - Sending ('user_MySql_2_ClickHouse_increment_srt_Activity', 'MySql_2_ClickHouse_Activity_Activity_Discuss_inc', datetime.datetime(2020, 7, 15, 14, 30, 36, 508244, tzinfo=<Timezone [UTC]>), 1) to executor with priority 1 and queue default
   [2020-07-15 14:30:40,926] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'user_MySql_2_ClickHouse_increment_srt_Activity', 'MySql_2_ClickHouse_Activity_Activity_Discuss_inc', '2020-07-15T14:30:36.508244+00:00', '--local', '--pool', 'default_pool', '-sd', '/opt/airflow/dags/MySql_2_ClickHouse_srt_Activity.py']
   [2020-07-15 14:30:40,926] {scheduler_job.py:1177} INFO - Sending ('user_MySql_2_ClickHouse_increment_srt_Activity', 'MySql_2_ClickHouse_Activity_Activity_inc', datetime.datetime(2020, 7, 15, 14, 30, 36, 508244, tzinfo=<Timezone [UTC]>), 1) to executor with priority 1 and queue default
   [2020-07-15 14:30:40,927] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'user_MySql_2_ClickHouse_increment_srt_Activity', 'MySql_2_ClickHouse_Activity_Activity_inc', '2020-07-15T14:30:36.508244+00:00', '--local', '--pool', 'default_pool', '-sd', '/opt/airflow/dags/MySql_2_ClickHouse_srt_Activity.py']
   [2020-07-15 14:30:51,338] {scheduler_job.py:1316} INFO - Executor reports execution of user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_inc execution_date=2020-07-15 14:30:36.508244+00:00 exited with status failed for try_number 1
   [2020-07-15 14:30:51,356] {scheduler_job.py:1333} ERROR - Executor reports task instance <TaskInstance: user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_inc 2020-07-15 14:30:36.508244+00:00 [queued]> finished (failed) although the task says its queued. Was the task killed externally?
   [2020-07-15 14:30:51,357] {dagbag.py:396} INFO - Filling up the DagBag from /opt/airflow/dags/MySql_2_ClickHouse_srt_Activity.py
   ClickHouse_url======================>>>>>>>>>>>>>>>>>>>>>>> jdbc:clickhouse://192.168.10.186:8123/ods
   ClickHouse_url======================>>>>>>>>>>>>>>>>>>>>>>> jdbc:clickhouse://192.168.10.186:8123/ods
   ClickHouse_url======================>>>>>>>>>>>>>>>>>>>>>>> jdbc:clickhouse://192.168.10.186:8123/ods
   ClickHouse_url======================>>>>>>>>>>>>>>>>>>>>>>> jdbc:clickhouse://192.168.10.186:8123/ods
   [2020-07-15 14:30:51,511] {taskinstance.py:1150} ERROR - Executor reports task instance <TaskInstance: user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_inc 2020-07-15 14:30:36.508244+00:00 [queued]> finished (failed) although the task says its queued. Was the task killed externally?
   NoneType: None
   [2020-07-15 14:30:51,620] {taskinstance.py:1194} INFO - Marking task as FAILED. dag_id=user_MySql_2_ClickHouse_increment_srt_Activity, task_id=MySql_2_ClickHouse_Activity_Activity_inc, execution_date=20200715T143036, start_date=, end_date=20200715T143051
   [2020-07-15 14:31:15,231] {scheduler_job.py:1316} INFO - Executor reports execution of user_MySql_2_ClickHouse_increment_srt_Activity.MySql_2_ClickHouse_Activity_Activity_Discuss_inc execution_date=2020-07-15 14:30:36.508244+00:00 exited with status success for try_number 1
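   For context, the ERROR at scheduler_job.py:1333 fires when the executor reports a terminal state for a task instance that is still marked `queued` in the metadata database — typically because the worker process died (OOM kill, external SIGKILL) or the DAG file failed to load on the worker before the task ever transitioned to `running`. A minimal Python sketch of that consistency check (simplified illustration, not the actual Airflow source; `process_executor_event` and this `TaskInstance` class are stand-ins):

   ```python
   # Simplified sketch of the scheduler's executor-event consistency check.
   # Not the real Airflow code: names and state handling are reduced to the
   # minimum needed to show why the error above appears.

   QUEUED, RUNNING, SUCCESS, FAILED = "queued", "running", "success", "failed"

   class TaskInstance:
       """Stand-in for the task-instance row in the metadata DB."""
       def __init__(self, task_id, state):
           self.task_id = task_id
           self.state = state

   def process_executor_event(ti, executor_state):
       """Handle one terminal-state report from the executor."""
       if executor_state in (SUCCESS, FAILED) and ti.state == QUEUED:
           # The task never reached RUNNING in the DB: the worker most
           # likely died before it could heartbeat. Mark the task failed
           # and surface the mismatch, as the scheduler log does above.
           msg = ("Executor reports task instance %s finished (%s) although "
                  "the task says its %s. Was the task killed externally?"
                  % (ti.task_id, executor_state, ti.state))
           ti.state = FAILED
           return msg
       # Normal case: the DB state already caught up with the executor.
       ti.state = executor_state
       return None

   # The failing task from the log: still queued when "failed" is reported.
   ti = TaskInstance("MySql_2_ClickHouse_Activity_Activity_inc", QUEUED)
   error = process_executor_event(ti, FAILED)
   ```

   In practice the next debugging steps are usually checking the worker logs and `dmesg` for OOM kills, and confirming the DAG file parses cleanly on the worker host.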

