alexxi0213 opened a new issue #22103:
URL: https://github.com/apache/airflow/issues/22103
### Apache Airflow version
2.2.3
### What happened
I configured MySQL as the Airflow backend DB, running one simple DAG with just
one operator task that moves data from one MySQL database to another.
I found that my daily task always fails and there are no logs for it; it seems
the DAG is never scheduled. I checked the log in the worker container and found
the errors below:
```
sqlalchemy.exc.OperationalError: (MySQLdb._exceptions.OperationalError)
(4031, 'The client was disconnected by the server because of inactivity. See
wait_timeout and interactive_timeout for configuring this behavior.')
[SQL: SELECT celery_taskmeta.id AS celery_taskmeta_id,
celery_taskmeta.task_id AS celery_taskmeta_task_id, celery_taskmeta.`status` AS
celery_taskmeta_status, celery_taskmeta.result AS celery_taskmeta_result,
celery_taskmeta.date_done AS celery_taskmeta_date_done,
celery_taskmeta.traceback AS celery_taskmeta_traceback
FROM celery_taskmeta
WHERE celery_taskmeta.task_id = %s]
```
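The error points at the server-side idle timeouts. One way to see what the backend MySQL is actually configured with is a quick diagnostic query (run against the backend DB):

```
SHOW VARIABLES WHERE Variable_name IN ('wait_timeout', 'interactive_timeout');
```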
I have already added some environment variables in the docker-compose file, like:
```
environment:
  &airflow-common-env
  AIRFLOW__CORE__SQL_ALCHEMY_POOL_RECYCLE: 1800
  AIRFLOW__CORE__SQL_ALCHEMY_POOL_PRE_PING: 'true'
```
I thought this would allow the task to recreate the DB connection rather than
reuse the old one.
My guess is that the Airflow worker is reusing an old DB connection that MySQL
has already closed, but it seems `SQL_ALCHEMY_POOL_PRE_PING` is not taking
effect on the worker.
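For reference, a minimal sketch (not Airflow's actual engine setup) of the two SQLAlchemy pool options the env variables above are meant to configure; the SQLite URL here is just a placeholder for `sql_alchemy_conn`:

```python
from sqlalchemy import create_engine, text

# pool_pre_ping issues a lightweight test query before handing out a pooled
# connection; pool_recycle replaces connections older than the given age.
# Together they should avoid handing out connections that MySQL already
# closed via wait_timeout.
engine = create_engine(
    "sqlite://",          # placeholder URL; in Airflow this is sql_alchemy_conn
    pool_pre_ping=True,   # AIRFLOW__CORE__SQL_ALCHEMY_POOL_PRE_PING
    pool_recycle=1800,    # AIRFLOW__CORE__SQL_ALCHEMY_POOL_RECYCLE
)

with engine.connect() as conn:
    print(conn.execute(text("SELECT 1")).scalar())  # → 1
```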
### What you expected to happen
When a setting like `SQL_ALCHEMY_POOL_PRE_PING` is added, it should take effect
on the Airflow worker.
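One hedged observation: the failing query reads from `celery_taskmeta`, which belongs to Celery's database result backend rather than Airflow's own metadata engine, so the worker may be hitting a separate SQLAlchemy engine that `SQL_ALCHEMY_POOL_PRE_PING` does not configure. If that is the case, Celery's `database_engine_options` setting might be the relevant knob (an assumption, not verified against this deployment):

```python
# Hypothetical Celery config sketch, assuming the result backend engine is
# the one timing out; Celery's database result backend forwards these kwargs
# to SQLAlchemy's create_engine().
database_engine_options = {
    "pool_pre_ping": True,   # test pooled connections before use
    "pool_recycle": 1800,    # drop connections older than 30 minutes
}
```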
### How to reproduce
Use MySQL as the backend store and run a daily task like:
```
from airflow import DAG
from datetime import datetime
from operators.my_operator import MySqlToMySqlOperator
default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2022, 2, 26),
}

dag = DAG(
    'test_mysql_to_mysql',
    default_args=default_args,
    description='MySqlOperatorExample',
    schedule_interval="@daily",
)

mysqlquery = """
select * from xxx;
"""

load_mysql_to_mysql = MySqlToMySqlOperator(
    sql=mysqlquery,
    src_mysql_conn_id="from",
    task_id="load_mysql_to_mysql",
    des_mysql_conn_id="to",
    dag=dag,
)

if __name__ == '__main__':
    dag.cli()
```
### Operating System
centos7
### Versions of Apache Airflow Providers
_No response_
### Deployment
Docker-Compose
### Deployment details
Official docker compose file, using MySQL as the backend:
```
services:
  mysql:
    image: mysql:8.0.27
    cap_add:
      - SYS_NICE
    ports:
      - "3308:3306"
    environment:
      MYSQL_ROOT_PASSWORD: root
      MYSQL_USER: airflow
      MYSQL_PASSWORD: xxxxx
      MYSQL_DATABASE: airflow
    command:
      - --default-authentication-plugin=mysql_native_password
      - --collation-server=utf8mb4_general_ci
      - --character-set-server=utf8mb4
    volumes:
      - mysql-db-volume:/var/lib/mysql
      - mysql-conf-volume:/etc/my.cnf
    healthcheck:
      test: mysql --user=$$MYSQL_USER --password=$$MYSQL_PASSWORD -e 'SHOW DATABASES;'  # healthcheck command
      interval: 5s
      retries: 5
    restart: always
```
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes, I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)