rishabhgarg7 opened a new issue #9053:
URL: https://github.com/apache/airflow/issues/9053
When I run "airflow scheduler" on my local machine, the scheduler fails with the
following SQLAlchemy error:
File
"/home/ubuntu/anaconda3/lib/python3.7/site-packages/sqlalchemy/engine/default.py",
line 590, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such column:
task_instance.pool_slots
[SQL: SELECT task_instance.try_number AS task_instance_try_number,
task_instance.task_id AS task_instance_task_id, task_instance.dag_id AS
task_instance_dag_id, task_instance.execution_date AS
task_instance_execution_date, task_instance.start_date AS
task_instance_start_date, task_instance.end_date AS task_instance_end_date,
task_instance.duration AS task_instance_duration, task_instance.state AS
task_instance_state, task_instance.max_tries AS task_instance_max_tries,
task_instance.hostname AS task_instance_hostname, task_instance.unixname AS
task_instance_unixname, task_instance.job_id AS task_instance_job_id,
task_instance.pool AS task_instance_pool, task_instance.pool_slots AS
task_instance_pool_slots, task_instance.queue AS task_instance_queue,
task_instance.priority_weight AS task_instance_priority_weight,
task_instance.operator AS task_instance_operator, task_instance.queued_dttm AS
task_instance_queued_dttm, task_instance.pid AS task_instance_pid,
task_instance.executor_config AS task_instance_executor_config
FROM task_instance JOIN dag_run ON task_instance.dag_id = dag_run.dag_id AND
task_instance.execution_date = dag_run.execution_date
WHERE dag_run.state = ? AND dag_run.run_id NOT LIKE ? AND
task_instance.state IN (?, ?)]
[parameters: ('running', 'backfill_%', 'scheduled', 'queued')]
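For context, this error typically means the metadata database schema predates the installed Airflow version: the task_instance.pool_slots column was added by a schema migration, so running the migration command for your version (airflow upgradedb on 1.10.x, airflow db upgrade on 2.x) usually resolves it. The underlying SQLite failure can be reproduced outside Airflow with a minimal sketch (hypothetical table, not Airflow's actual schema):

```python
import sqlite3

# Hypothetical minimal reproduction: a table created by an older schema
# (without pool_slots) queried by newer code that expects the column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE task_instance (task_id TEXT, pool TEXT)")

try:
    conn.execute("SELECT task_instance.pool_slots FROM task_instance")
except sqlite3.OperationalError as e:
    # sqlite3 reports exactly the error seen in the scheduler log
    print(e)  # no such column: task_instance.pool_slots
```

Any query naming a column the table lacks raises sqlite3.OperationalError, which SQLAlchemy re-raises as sqlalchemy.exc.OperationalError, matching the traceback above.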
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]