anujjamwal opened a new issue #12400:
URL: https://github.com/apache/airflow/issues/12400
I am trying to set up an Airflow installation using the Docker image
2.0.0b2-python3.7, with MySQL 5.7 on GCP Cloud SQL as the metadata database.
The scheduler keeps exiting with the error below:
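For context on the driver: the traceback shows SQLAlchemy calling into
`mysql/connector/` (the `mysql-connector-python` package), i.e. a connection
URL of the form `mysql+mysqlconnector://user:***@<cloudsql-host>/airflow`
(user, password, and host here are placeholders, not taken from the actual
config).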
```
  ____________       _____________
 ____    |__( )_________  __/__  /________      __
____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
 _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
[2020-11-17 04:09:34,142] {scheduler_job.py:1249} INFO - Starting the scheduler
[2020-11-17 04:09:34,143] {scheduler_job.py:1254} INFO - Processing each file at most -1 times
[2020-11-17 04:09:34,144] {kubernetes_executor.py:520} INFO - Start Kubernetes executor
[2020-11-17 04:09:34,169] {kubernetes_executor.py:126} INFO - Event: and now my watch begins starting at resource_version: 0
[2020-11-17 04:09:34,259] {kubernetes_executor.py:462} INFO - When executor started up, found 0 queued task instances
[2020-11-17 04:09:34,306] {dag_processing.py:250} INFO - Launched DagFileProcessorManager with pid: 39
[2020-11-17 04:09:34,308] {scheduler_job.py:1761} INFO - Resetting orphaned tasks for active dag runs
/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:332 DeprecationWarning: The logging_level option in [core] has been moved to the logging_level option in [logging] - the old setting has been used, but please update your config.
/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:332 DeprecationWarning: The fab_logging_level option in [core] has been moved to the fab_logging_level option in [logging] - the old setting has been used, but please update your config.
/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:332 DeprecationWarning: The remote_logging option in [core] has been moved to the remote_logging option in [logging] - the old setting has been used, but please update your config.
[2020-11-17 04:09:34,322] {settings.py:52} INFO - Configured default timezone Timezone('UTC')
/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:332 DeprecationWarning: The task_log_reader option in [core] has been moved to the task_log_reader option in [logging] - the old setting has been used, but please update your config.
[2020-11-17 04:09:34,491] {scheduler_job.py:1301} ERROR - Exception when executing SchedulerJob._run_scheduler_loop
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/mysql/connector/conversion.py", line 183, in to_mysql
    return getattr(self, "_{0}_to_mysql".format(type_name))(value)
AttributeError: 'MySQLConverter' object has no attribute '_dagruntype_to_mysql'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/mysql/connector/cursor.py", line 410, in _process_params_dict
    conv = to_mysql(conv)
  File "/home/airflow/.local/lib/python3.7/site-packages/mysql/connector/conversion.py", line 186, in to_mysql
    "MySQL type".format(type_name))
TypeError: Python 'dagruntype' cannot be converted to a MySQL type

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
    cursor, statement, parameters, context
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 593, in do_execute
    cursor.execute(statement, parameters)
  File "/home/airflow/.local/lib/python3.7/site-packages/mysql/connector/cursor.py", line 555, in execute
    stmt, self._process_params_dict(params))
  File "/home/airflow/.local/lib/python3.7/site-packages/mysql/connector/cursor.py", line 419, in _process_params_dict
    "Failed processing pyformat-parameters; %s" % err)
mysql.connector.errors.ProgrammingError: Failed processing pyformat-parameters; Python 'dagruntype' cannot be converted to a MySQL type

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 1283, in _execute
    self._run_scheduler_loop()
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 1357, in _run_scheduler_loop
    self.adopt_or_reset_orphaned_tasks()
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 63, in wrapper
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 1797, in adopt_or_reset_orphaned_tasks
    tis_to_reset_or_adopt = with_row_locks(query, of=TI, **skip_locked(session=session)).all()
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3346, in all
    return list(self)
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3508, in __iter__
    return self._execute_and_instances(context)
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3533, in _execute_and_instances
    result = conn.execute(querycontext.statement, self._params)
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1011, in execute
    return meth(self, multiparams, params)
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1130, in _execute_clauseelement
    distilled_params,
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1317, in _execute_context
    e, statement, parameters, cursor, context
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1511, in _handle_dbapi_exception
    sqlalchemy_exception, with_traceback=exc_info[2], from_=e
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
    raise exception
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
    cursor, statement, parameters, context
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 593, in do_execute
    cursor.execute(statement, parameters)
  File "/home/airflow/.local/lib/python3.7/site-packages/mysql/connector/cursor.py", line 555, in execute
    stmt, self._process_params_dict(params))
  File "/home/airflow/.local/lib/python3.7/site-packages/mysql/connector/cursor.py", line 419, in _process_params_dict
    "Failed processing pyformat-parameters; %s" % err)
sqlalchemy.exc.ProgrammingError: (mysql.connector.errors.ProgrammingError) Failed processing pyformat-parameters; Python 'dagruntype' cannot be converted to a MySQL type
[SQL: SELECT task_instance.task_id AS task_instance_task_id, task_instance.dag_id AS task_instance_dag_id, task_instance.execution_date AS task_instance_execution_date
FROM task_instance LEFT OUTER JOIN job ON job.id = task_instance.queued_by_job_id INNER JOIN dag_run ON task_instance.dag_id = dag_run.dag_id AND task_instance.execution_date = dag_run.execution_date
WHERE task_instance.state IN (%(state_1)s, %(state_2)s, %(state_3)s) AND (task_instance.queued_by_job_id IS NULL OR job.state != %(state_4)s) AND dag_run.run_type != %(run_type_1)s AND dag_run.state = %(state_5)s FOR UPDATE]
[parameters: {'state_1': 'scheduled', 'state_2': 'queued', 'state_3': 'running', 'state_4': 'running', 'run_type_1': <DagRunType.BACKFILL_JOB: 'backfill'>, 'state_5': 'running'}]
(Background on this error at: http://sqlalche.me/e/13/f405)
[2020-11-17 04:09:35,498] {process_utils.py:95} INFO - Sending Signals.SIGTERM to GPID 39
[2020-11-17 04:09:35,711] {process_utils.py:61} INFO - Process psutil.Process(pid=39, status='terminated', exitcode=0, started='04:09:34') (39) terminated with exit code 0
[2020-11-17 04:09:35,712] {scheduler_job.py:1304} INFO - Exited execute loop
```
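
The failing conversion is reproducible outside Airflow. Per the first
traceback above, `MySQLConverter.to_mysql()` dispatches on the lowercased
class name of each bind parameter, so it looks for a `_dagruntype_to_mysql`
method and raises the `TypeError` when none exists, even though `DagRunType`
subclasses `str`. A minimal sketch, assuming only `mysql-connector-python` is
installed and using a stand-in enum that mirrors
`airflow.utils.types.DagRunType`:

```python
import enum

from mysql.connector.conversion import MySQLConverter


class DagRunType(str, enum.Enum):
    """Stand-in mirroring airflow.utils.types.DagRunType (a str-backed enum)."""

    BACKFILL_JOB = "backfill"


converter = MySQLConverter()

# Works: a plain str has a matching MySQLConverter._str_to_mysql method.
print(converter.to_mysql("backfill"))  # b'backfill'

# Fails: dispatch looks up MySQLConverter._dagruntype_to_mysql by class name,
# which does not exist, so to_mysql() raises
# TypeError: Python 'dagruntype' cannot be converted to a MySQL type
try:
    converter.to_mysql(DagRunType.BACKFILL_JOB)
except TypeError as err:
    print(err)

# Passing the underlying value instead of the enum member converts cleanly,
# so the failure is specific to how this driver dispatches conversions.
print(converter.to_mysql(DagRunType.BACKFILL_JOB.value))  # b'backfill'
```

So presumably either the scheduler needs to bind `run_type_1` as a plain
string (e.g. `DagRunType.BACKFILL_JOB.value`) when running against this
driver, or mysql-connector-python should be documented as unsupported for the
metadata database.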
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]