scrawfor edited a comment on issue #13799:
URL: https://github.com/apache/airflow/issues/13799#issuecomment-771753922
@kaxil Sure.
@lgwacker - Unfortunately, that did not work for me. ~~Even renaming the dag doesn't seem to have solved the issue.~~ Renaming the dag did fix my issue, although I had to restart the service twice.
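
For anyone trying the same workaround: by "renaming the dag" I just mean giving it a new `dag_id` in the DAG definition so the scheduler registers it as a fresh DAG. A minimal sketch (the `my_dag` / `my_dag_v2` names are made up for illustration):

```python
# Minimal illustration of the rename workaround (names are hypothetical).
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy import DummyOperator

with DAG(
    dag_id="my_dag_v2",  # previously "my_dag"; a new dag_id is treated as a new DAG
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    DummyOperator(task_id="noop")
```

The log below is what I was seeing before: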
```sh
  ____________       _____________
 ____    |__( )_________  __/__  /________      __
____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
___  ___ |  / _  / _  __/ _  / / /_/ /_ |/ |/ /
 _/_/  |_/_/  /_/  /_/  /_/  \____/____/|__/
[2021-02-02 15:03:30,310] {scheduler_job.py:1241} INFO - Starting the scheduler
[2021-02-02 15:03:30,310] {scheduler_job.py:1246} INFO - Processing each file at most -1 times
[2021-02-02 15:03:30,413] {dag_processing.py:250} INFO - Launched DagFileProcessorManager with pid: 118
[2021-02-02 15:03:30,414] {scheduler_job.py:1751} INFO - Resetting orphaned tasks for active dag runs
[2021-02-02 15:03:30,463] {settings.py:52} INFO - Configured default timezone Timezone('America/New_York')
[2021-02-02 15:03:30,643] {scheduler_job.py:938} INFO - 32 tasks up for execution:
<TASK LIST WAS HERE>
[2021-02-02 15:03:30,652] {scheduler_job.py:967} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 32 task instances ready to be queued
[2021-02-02 15:03:30,652] {scheduler_job.py:995} INFO - DAG <dag1> has 0/16 running and queued tasks
[2021-02-02 15:03:30,652] {scheduler_job.py:995} INFO - DAG <dag1> has 1/16 running and queued tasks
[2021-02-02 15:03:30,652] {scheduler_job.py:995} INFO - DAG <dag1> has 2/16 running and queued tasks
[2021-02-02 15:03:30,653] {scheduler_job.py:995} INFO - DAG <dag1> has 3/16 running and queued tasks
[2021-02-02 15:03:30,653] {scheduler_job.py:995} INFO - DAG <dag1> has 4/16 running and queued tasks
[2021-02-02 15:03:30,653] {scheduler_job.py:995} INFO - DAG <dag1> has 5/16 running and queued tasks
[2021-02-02 15:03:30,653] {scheduler_job.py:995} INFO - DAG <dag2> has 0/16 running and queued tasks
[2021-02-02 15:03:30,653] {scheduler_job.py:995} INFO - DAG <dag2> has 1/16 running and queued tasks
[2021-02-02 15:03:30,654] {scheduler_job.py:995} INFO - DAG <dag2> has 2/16 running and queued tasks
[2021-02-02 15:03:30,654] {scheduler_job.py:995} INFO - DAG <dag2> has 3/16 running and queued tasks
[2021-02-02 15:03:30,654] {scheduler_job.py:995} INFO - DAG <dag2> has 4/16 running and queued tasks
[2021-02-02 15:03:30,661] {scheduler_job.py:1293} ERROR - Exception when executing SchedulerJob._run_scheduler_loop
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1275, in _execute
    self._run_scheduler_loop()
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1377, in _run_scheduler_loop
    num_queued_tis = self._do_scheduling(session)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1533, in _do_scheduling
    num_queued_tis = self._critical_section_execute_task_instances(session=session)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1132, in _critical_section_execute_task_instances
    queued_tis = self._executable_task_instances_to_queued(max_tis, session=session)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/session.py", line 62, in wrapper
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1034, in _executable_task_instances_to_queued
    if task_instance.pool_slots > open_slots:
TypeError: '>' not supported between instances of 'NoneType' and 'int'
[2021-02-02 15:03:31,668] {process_utils.py:95} INFO - Sending Signals.SIGTERM to GPID 118
[2021-02-02 15:03:32,157] {process_utils.py:61} INFO - Process psutil.Process(pid=164, status='terminated', started='15:03:30') (164) terminated with exit code None
[2021-02-02 15:03:32,175] {process_utils.py:61} INFO - Process psutil.Process(pid=172, status='terminated', started='15:03:30') (172) terminated with exit code None
[2021-02-02 15:03:32,177] {process_utils.py:201} INFO - Waiting up to 5 seconds for processes to exit...
[2021-02-02 15:03:32,188] {process_utils.py:61} INFO - Process psutil.Process(pid=118, status='terminated', exitcode=0, started='15:03:29') (118) terminated with exit code 0
[2021-02-02 15:03:32,189] {scheduler_job.py:1296} INFO - Exited execute loop
Process QueuedLocalWorker-29:
Process QueuedLocalWorker-31:
Process QueuedLocalWorker-33:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
    self.run()
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/executors/local_executor.py", line 69, in run
    return super().run()
  File "/usr/local/lib/python3.8/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/executors/local_executor.py", line 176, in do_work
    key, command = self.task_queue.get()
  File "<string>", line 2, in get
  File "/usr/local/lib/python3.8/multiprocessing/managers.py", line 835, in _callmethod
    kind, result = conn.recv()
  File "/usr/local/lib/python3.8/multiprocessing/connection.py", line 250, in recv
    buf = self._recv_bytes()
  File "/usr/local/lib/python3.8/multiprocessing/connection.py", line 414, in _recv_bytes
    buf = self._recv(4)
  File "/usr/local/lib/python3.8/multiprocessing/connection.py", line 383, in _recv
    raise EOFError
EOFError
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
    self.run()
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/executors/local_executor.py", line 69, in run
    return super().run()
  File "/usr/local/lib/python3.8/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/executors/local_executor.py", line 176, in do_work
    key, command = self.task_queue.get()
  File "<string>", line 2, in get
  File "/usr/local/lib/python3.8/multiprocessing/managers.py", line 835, in _callmethod
    kind, result = conn.recv()
  File "/usr/local/lib/python3.8/multiprocessing/connection.py", line 250, in recv
    buf = self._recv_bytes()
  File "/usr/local/lib/python3.8/multiprocessing/connection.py", line 414, in _recv_bytes
    buf = self._recv(4)
  File "/usr/local/lib/python3.8/multiprocessing/connection.py", line 383, in _recv
    raise EOFError
EOFError
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
    self.run()
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/executors/local_executor.py", line 69, in run
    return super().run()
  File "/usr/local/lib/python3.8/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/executors/local_executor.py", line 176, in do_work
    key, command = self.task_queue.get()
  File "<string>", line 2, in get
  File "/usr/local/lib/python3.8/multiprocessing/managers.py", line 835, in _callmethod
    kind, result = conn.recv()
  File "/usr/local/lib/python3.8/multiprocessing/connection.py", line 250, in recv
    buf = self._recv_bytes()
  File "/usr/local/lib/python3.8/multiprocessing/connection.py", line 414, in _recv_bytes
    buf = self._recv(4)
  File "/usr/local/lib/python3.8/multiprocessing/connection.py", line 383, in _recv
    raise EOFError
EOFError
Process QueuedLocalWorker-32:
Process QueuedLocalWorker-26:
Process QueuedLocalWorker-30:
```
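
Since the traceback points at `task_instance.pool_slots` being `None`, here is a rough debugging sketch (assuming Airflow 2.0's ORM models and a reachable metadata DB; it only lists the offending rows, it is not a fix) for finding task instances whose `pool_slots` is NULL, which is what makes the `task_instance.pool_slots > open_slots` comparison above raise the `TypeError`:

```python
# Rough debugging sketch (assumes Airflow 2.0 ORM models and metadata DB access):
# list task instances whose pool_slots is NULL.
from airflow.models import TaskInstance
from airflow.utils.session import create_session

with create_session() as session:
    broken = (
        session.query(TaskInstance)
        .filter(TaskInstance.pool_slots.is_(None))
        .all()
    )
    for ti in broken:
        print(ti.dag_id, ti.task_id, ti.execution_date)
```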
*Env Details:*
* Docker Image: apache/airflow:2.0.0-python3.8
* Postgres Metadata DB
* Local Executor