kaxil commented on issue #5743: [AIRFLOW-5088][AIP-24] Persisting serialized DAG in DB for webserver scalability

URL: https://github.com/apache/airflow/pull/5743#issuecomment-529024391

@ashb @coufon With the above commit (https://github.com/apache/airflow/pull/5743/commits/951cee34e6ecb60bfa8b3f3f946938a43caa5a3a) I have basically enforced the JSON column type. Below is the code Ash suggested, which we might want to use to stay a little flexible: it lets people run old versions of MySQL and uses JSONB for Postgres:

```python
import sqlalchemy as sa
from alembic import op
from sqlalchemy.dialects import postgresql

json_type = sa.JSON
conn = op.get_bind()  # pylint: disable=no-member

if conn.dialect.name == "mysql":
    # MySQL 5.7+ / MariaDB 10.2.3+ have JSON support. Rather than checking
    # version strings, probe for the JSON_VALID function directly.
    try:
        conn.execute("SELECT JSON_VALID(1)").fetchone()
    except sa.exc.OperationalError:
        json_type = sa.UnicodeText
elif conn.dialect.name == "postgresql":
    json_type = postgresql.JSONB
```

A few related notes on which I would like your feedback:

* Do we want to let users run old versions of MySQL (and MariaDB versions that don't support JSON via SQLAlchemy), or should we add a note telling them that serialisation requires either a newer DB version or a DB (Postgres) that supports JSON columns — or else to just keep `store_serialised_dags=False`? If we want to support older versions or DBs without JSON columns, then we will have to handle those exceptions and store the serialised DAG (Python dicts) as a string, using `json.loads` and `json.dumps`.
* JSON vs JSONB? - Opinions??
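For illustration, the text-column fallback mentioned in the first bullet could look roughly like the sketch below. The helper names (`dag_to_db_value`, `db_value_to_dag`) and the `supports_json` flag are hypothetical, not from the PR; the point is only that on a plain text column the dict must be encoded/decoded manually, whereas a native JSON column lets SQLAlchemy handle it:

```python
import json

def dag_to_db_value(dag_dict, supports_json):
    """Return the value to write to the serialized-DAG column.

    With a native JSON column, SQLAlchemy encodes the dict itself;
    with a plain text column we must json.dumps it ourselves.
    """
    if supports_json:
        return dag_dict
    return json.dumps(dag_dict)

def db_value_to_dag(value, supports_json):
    """Return the DAG dict read back from the serialized-DAG column."""
    if supports_json:
        return value
    return json.loads(value)
```

A round trip through the text-column path (`supports_json=False`) then gives back an identical dict, which is what the exception-handling branch in the bullet would rely on.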