eflemist opened a new issue #14834:
URL: https://github.com/apache/airflow/issues/14834
Apache Airflow version: 1.10.7
OS: Ubuntu
I am new to Airflow and am trying to run my first DAG through the scheduler,
but I am seeing the following error:
[2021-03-16 10:32:23,969] {scheduler_job.py:960} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 1 task instances ready to be queued
[2021-03-16 10:32:23,970] {scheduler_job.py:988} INFO - DAG lesson1.demo1 has 0/16 running and queued tasks
[2021-03-16 10:32:23,978] {scheduler_job.py:1038} INFO - Setting the following tasks to queued state:
	<TaskInstance: lesson1.demo1.greet_task 2021-03-16 14:32:08.873884+00:00 [scheduled]>
[2021-03-16 10:32:23,996] {scheduler_job.py:1112} INFO - Setting the following 1 tasks to queued state:
	<TaskInstance: lesson1.demo1.greet_task 2021-03-16 14:32:08.873884+00:00 [queued]>
[2021-03-16 10:32:23,997] {scheduler_job.py:1148} INFO - Sending ('lesson1.demo1', 'greet_task', datetime.datetime(2021, 3, 16, 14, 32, 8, 873884, tzinfo=<Timezone [UTC]>), 1) to executor with priority 1 and queue default
[2021-03-16 10:32:24,000] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'lesson1.demo1', 'greet_task', '2021-03-16T14:32:08.873884+00:00', '--local', '--pool', 'default_pool', '-sd', '/home/@@@@@@@/.local/lib/python3.6/site-packages/airflow/example_dags/demo1.py']
[2021-03-16 10:32:24,005] {sequential_executor.py:45} INFO - Executing command: ['airflow', 'run', 'lesson1.demo1', 'greet_task', '2021-03-16T14:32:08.873884+00:00', '--local', '--pool', 'default_pool', '-sd', '/home/@@@@@@@/.local/lib/python3.6/site-packages/airflow/example_dags/demo1.py']
[2021-03-16 10:32:24,035] {scheduler_job.py:1361} ERROR - Exception when executing execute_helper
Traceback (most recent call last):
  File "/home/@@@@@@@/.local/lib/python3.6/site-packages/airflow/jobs/scheduler_job.py", line 1359, in _execute
    self._execute_helper()
  File "/home/@@@@@@@/.local/lib/python3.6/site-packages/airflow/jobs/scheduler_job.py", line 1420, in _execute_helper
    if not self._validate_and_run_task_instances(simple_dag_bag=simple_dag_bag):
  File "/home/@@@@@@@/.local/lib/python3.6/site-packages/airflow/jobs/scheduler_job.py", line 1482, in _validate_and_run_task_instances
    self.executor.heartbeat()
  File "/home/@@@@@@@/.local/lib/python3.6/site-packages/airflow/executors/base_executor.py", line 134, in heartbeat
    self.sync()
  File "/home/@@@@@@@/.local/lib/python3.6/site-packages/airflow/executors/sequential_executor.py", line 48, in sync
    subprocess.check_call(command, close_fds=True)
  File "/usr/local/lib/python3.6/subprocess.py", line 286, in check_call
    retcode = call(*popenargs, **kwargs)
  File "/usr/local/lib/python3.6/subprocess.py", line 267, in call
    with Popen(*popenargs, **kwargs) as p:
  File "/usr/local/lib/python3.6/subprocess.py", line 707, in __init__
    restore_signals, start_new_session)
  File "/usr/local/lib/python3.6/subprocess.py", line 1326, in _execute_child
    raise child_exception_type(errno_num, err_msg)
FileNotFoundError: [Errno 2] No such file or directory: 'airflow'
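For reference, the failure can be reproduced outside Airflow: `subprocess.check_call` raises `FileNotFoundError` with errno 2 whenever the executable cannot be resolved against the caller's PATH, which is exactly what the SequentialExecutor hits when it tries to spawn 'airflow'. A minimal sketch (the command name here is a made-up placeholder, not anything from the log):

```python
import subprocess

# Popen resolves the executable against the calling process's PATH; if
# the name is not found, FileNotFoundError (errno 2, ENOENT) is raised
# before any child process starts -- the same error shown above.
try:
    subprocess.check_call(["no-such-executable-xyz"], close_fds=True)
except FileNotFoundError as exc:
    print(exc.errno)  # 2
```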
Any ideas for troubleshooting?
- I have read issue #11309, but that seems to be a different scenario: I am
not using systemd, just launching `airflow scheduler` from the terminal in
my home directory
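Since the paths in the log point at a pip --user install under ~/.local, one possible check (an assumption on my part, not confirmed) is whether the 'airflow' entrypoint lives in ~/.local/bin but that directory is missing from the PATH of the shell that launched the scheduler:

```shell
# Can the current shell resolve the airflow entrypoint at all?
command -v airflow || echo "airflow is not on PATH"

# Assumption: a pip --user install places console scripts in ~/.local/bin.
# Prepending it to PATH and re-checking would confirm or rule this out.
export PATH="$HOME/.local/bin:$PATH"
command -v airflow || echo "still not found"
```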
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]