[jira] [Commented] (AIRFLOW-3036) Upgrading to Airflow 1.10 not possible using GCP Cloud SQL for MYSQL
[ https://issues.apache.org/jira/browse/AIRFLOW-3036?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16705871#comment-16705871 ]

Roger Kaufmann commented on AIRFLOW-3036:

I ran into this issue too on Swisscom Cloud Foundry (the only relational DB option there is MariaDB). A solution was posted on the mailing list by Feng Lu, which enabled me to install it, but since we are only starting to try out Airflow I cannot yet confirm whether it is a good long-term solution: [https://www.mail-archive.com/dev@airflow.incubator.apache.org/msg06640.html]

> Upgrading to Airflow 1.10 not possible using GCP Cloud SQL for MYSQL
>
> Key: AIRFLOW-3036
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3036
> Project: Apache Airflow
> Issue Type: Bug
> Components: core, db
> Affects Versions: 1.10.0
> Environment: Google Cloud Platform, Google Kubernetes Engine, Airflow 1.10 on Debian Stretch, Google Cloud SQL MySQL
> Reporter: Smith Mathieu
> Priority: Blocker
> Labels: 1.10, google, google-cloud-sql
>
> The upgrade path to airflow 1.10 seems impossible for users of MySQL in Google's Cloud SQL service, given the new MySQL requirements for 1.10.
>
> When executing "airflow upgradedb":
> ```
> INFO [alembic.runtime.migration] Running upgrade d2ae31099d61 -> 0e2a74e0fc9f, Add time zone awareness
> Traceback (most recent call last):
>   File "/usr/local/bin/airflow", line 32, in
>     args.func(args)
>   File "/usr/local/lib/python3.6/site-packages/airflow/bin/cli.py", line 1002, in initdb
>     db_utils.initdb(settings.RBAC)
>   File "/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 92, in initdb
>     upgradedb()
>   File "/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 346, in upgradedb
>     command.upgrade(config, 'heads')
>   File "/usr/local/lib/python3.6/site-packages/alembic/command.py", line 174, in upgrade
>     script.run_env()
>   File "/usr/local/lib/python3.6/site-packages/alembic/script/base.py", line 416, in run_env
>     util.load_python_file(self.dir, 'env.py')
>   File "/usr/local/lib/python3.6/site-packages/alembic/util/pyfiles.py", line 93, in load_python_file
>     module = load_module_py(module_id, path)
>   File "/usr/local/lib/python3.6/site-packages/alembic/util/compat.py", line 68, in load_module_py
>     module_id, path).load_module(module_id)
>   File "", line 399, in _check_name_wrapper
>   File "", line 823, in load_module
>   File "", line 682, in load_module
>   File "", line 265, in _load_module_shim
>   File "", line 684, in _load
>   File "", line 665, in _load_unlocked
>   File "", line 678, in exec_module
>   File "", line 219, in _call_with_frames_removed
>   File "/usr/local/lib/python3.6/site-packages/airflow/migrations/env.py", line 91, in
>     run_migrations_online()
>   File "/usr/local/lib/python3.6/site-packages/airflow/migrations/env.py", line 86, in run_migrations_online
>     context.run_migrations()
>   File "", line 8, in run_migrations
>   File "/usr/local/lib/python3.6/site-packages/alembic/runtime/environment.py", line 807, in run_migrations
>     self.get_context().run_migrations(**kw)
>   File "/usr/local/lib/python3.6/site-packages/alembic/runtime/migration.py", line 321, in run_migrations
>     step.migration_fn(**kw)
>   File "/usr/local/lib/python3.6/site-packages/airflow/migrations/versions/0e2a74e0fc9f_add_time_zone_awareness.py", line 46, in upgrade
>     raise Exception("Global variable explicit_defaults_for_timestamp needs to be on (1) for mysql")
> Exception: Global variable explicit_defaults_for_timestamp needs to be on (1) for mysql
> ```
>
> Reading the documentation for upgrading to Airflow 1.10, it seems the requirement for explicit_defaults_for_timestamp=1 was intentional.
>
> However, MySQL on Google Cloud SQL does not support configuring this variable, and it is off by default. Users of MySQL on Cloud SQL do not have an upgrade path to 1.10. Alas, so close to the mythical Kubernetes Executor. In GCP, Cloud SQL is _the_ hosted MySQL solution.
> [https://cloud.google.com/sql/docs/mysql/flags]

-- This message was sent by Atlassian JIRA (v7.6.3#76005)
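For context on where the quoted exception comes from: the timezone migration starts with a guard that queries the server variable and aborts when it is off. Below is a minimal sketch of that guard (an assumption-laden simplification of 0e2a74e0fc9f_add_time_zone_awareness.py; the real migration gets its connection from Alembic, and FakeConn/FakeCursor here are purely illustrative stand-ins):

```python
# Simplified sketch of the migration guard that produces the traceback above.
def assert_mysql_timestamp_defaults(conn):
    # The migration refuses to run unless the server reports
    # explicit_defaults_for_timestamp = 1.
    cur = conn.execute("SELECT @@explicit_defaults_for_timestamp")
    res = cur.fetchall()
    if res[0][0] == 0:
        raise Exception(
            "Global variable explicit_defaults_for_timestamp needs to be "
            "on (1) for mysql")


class FakeCursor:
    def __init__(self, rows):
        self.rows = rows

    def fetchall(self):
        return self.rows


class FakeConn:
    def __init__(self, flag_value):
        self.flag_value = flag_value

    def execute(self, sql):
        return FakeCursor([(self.flag_value,)])


# Cloud SQL reports the flag as off (0), so the migration aborts:
try:
    assert_mysql_timestamp_defaults(FakeConn(0))
    migration_aborted = False
except Exception:
    migration_aborted = True
```

This is why the failure happens at migration time rather than at runtime: the check runs before any schema change is attempted.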
[jira] [Commented] (AIRFLOW-3036) Upgrading to Airflow 1.10 not possible using GCP Cloud SQL for MYSQL
[ https://issues.apache.org/jira/browse/AIRFLOW-3036?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16669256#comment-16669256 ]

Bolke de Bruin commented on AIRFLOW-3036:

Google is testing an alternative. Meanwhile, you could try replacing `cur = conn.execute("SELECT @@explicit_defaults_for_timestamp")` in airflow/migrations/versions/0e2a74e0fc9f_add_time_zone_awareness.py with `cur = conn.execute("SET explicit_defaults_for_timestamp=1")`.
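Bolke's suggested edit above swaps the read-only check for a session-level SET, so the migration's own connection runs with explicit_defaults_for_timestamp enabled even though the global stays off. A sketch of the effect (assumptions: the server must permit changing this variable at session scope, which depends on the MySQL/MariaDB version and on what the managed service allows; the FakeConn below merely records statements for illustration):

```python
# Sketch of the suggested workaround: enable the behavior for this session
# instead of failing when the global variable is off.
def enable_timestamp_defaults_for_session(conn):
    conn.execute("SET explicit_defaults_for_timestamp=1")


class FakeConn:
    def __init__(self):
        self.statements = []

    def execute(self, sql):
        # Record the statement rather than talking to a real server.
        self.statements.append(sql)


conn = FakeConn()
enable_timestamp_defaults_for_session(conn)
```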
[jira] [Commented] (AIRFLOW-3036) Upgrading to Airflow 1.10 not possible using GCP Cloud SQL for MYSQL
[ https://issues.apache.org/jira/browse/AIRFLOW-3036?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16648513#comment-16648513 ]

neil90 commented on AIRFLOW-3036:

[~ashb], I don't mind giving this a shot.
[jira] [Commented] (AIRFLOW-3036) Upgrading to Airflow 1.10 not possible using GCP Cloud SQL for MYSQL
[ https://issues.apache.org/jira/browse/AIRFLOW-3036?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16629949#comment-16629949 ]

Ash Berlin-Taylor commented on AIRFLOW-3036:

This is going to need someone who is familiar with MySQL's handling of TIMESTAMP and DATETIME columns to experiment and check that MySQL does the right thing with these columns, even when the time zone of the DB server is set to something other than UTC, and to make sure that it doesn't play silly games with default values (which is what the setting we require turns off).
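For reference, the "silly games with default values" Ash mentions are, roughly per the MySQL documentation, the legacy implicit defaults applied to TIMESTAMP columns when the flag is off. A sketch of the difference (assumptions: this summarizes documented server behavior rather than Airflow code, and the table/column names are hypothetical):

```python
# DDL as a migration might write it:
DDL_AS_WRITTEN = "CREATE TABLE t (ts TIMESTAMP)"

# With explicit_defaults_for_timestamp OFF (legacy behavior), the first
# TIMESTAMP column silently becomes NOT NULL with automatic default and
# on-update behavior:
DDL_EFFECTIVE_FLAG_OFF = (
    "CREATE TABLE t (ts TIMESTAMP NOT NULL "
    "DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP)"
)

# With the flag ON (standard-SQL behavior), no implicit defaults are added
# and the column is nullable -- the predictable behavior the timezone
# migration relies on:
DDL_EFFECTIVE_FLAG_ON = "CREATE TABLE t (ts TIMESTAMP NULL DEFAULT NULL)"
```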
[jira] [Commented] (AIRFLOW-3036) Upgrading to Airflow 1.10 not possible using GCP Cloud SQL for MYSQL
[ https://issues.apache.org/jira/browse/AIRFLOW-3036?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16629341#comment-16629341 ]

Iuliia Volkova commented on AIRFLOW-3036:

[~smith-m], please set the task to unassigned.
[jira] [Commented] (AIRFLOW-3036) Upgrading to Airflow 1.10 not possible using GCP Cloud SQL for MYSQL
[ https://issues.apache.org/jira/browse/AIRFLOW-3036?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16629339#comment-16629339 ]

Iuliia Volkova commented on AIRFLOW-3036:

[~smith-m], I'm not sure that anybody can resolve this without Bolke [~bolke].
[jira] [Commented] (AIRFLOW-3036) Upgrading to Airflow 1.10 not possible using GCP Cloud SQL for MYSQL
[ https://issues.apache.org/jira/browse/AIRFLOW-3036?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16617483#comment-16617483 ]

ASF GitHub Bot commented on AIRFLOW-3036:

Fokko closed pull request #3908: [AIRFLOW-3036] Add relevant ECS options to ECS operator.
URL: https://github.com/apache/incubator-airflow/pull/3908

As this is a foreign pull request (from a fork), the diff is supplied below (as it won't show otherwise due to GitHub magic):

```
diff --git a/airflow/contrib/operators/ecs_operator.py b/airflow/contrib/operators/ecs_operator.py
index c85ae15b77..8bad285ffd 100644
--- a/airflow/contrib/operators/ecs_operator.py
+++ b/airflow/contrib/operators/ecs_operator.py
@@ -45,6 +45,15 @@ class ECSOperator(BaseOperator):
     :type region_name: str
     :param launch_type: the launch type on which to run your task ('EC2' or 'FARGATE')
     :type launch_type: str
+    :param group: the name of the task group associated with the task
+    :type group: str
+    :param placement_constraints: an array of placement constraint objects to use for
+        the task
+    :type placement_constraints: list
+    :param platform_version: the platform version on which your task is running
+    :type platform_version: str
+    :param network_configuration: the network configuration for the task
+    :type network_configuration: dict
     """

     ui_color = '#f0ede4'
@@ -54,7 +63,9 @@ class ECSOperator(BaseOperator):
     @apply_defaults
     def __init__(self, task_definition, cluster, overrides,
-                 aws_conn_id=None, region_name=None, launch_type='EC2', **kwargs):
+                 aws_conn_id=None, region_name=None, launch_type='EC2',
+                 group=None, placement_constraints=None, platform_version='LATEST',
+                 network_configuration=None, **kwargs):
         super(ECSOperator, self).__init__(**kwargs)

         self.aws_conn_id = aws_conn_id
@@ -63,6 +74,10 @@ def __init__(self, task_definition, cluster, overrides,
         self.cluster = cluster
         self.overrides = overrides
         self.launch_type = launch_type
+        self.group = group
+        self.placement_constraints = placement_constraints
+        self.platform_version = platform_version
+        self.network_configuration = network_configuration

         self.hook = self.get_hook()
@@ -78,13 +93,21 @@ def execute(self, context):
             region_name=self.region_name
         )

-        response = self.client.run_task(
-            cluster=self.cluster,
-            taskDefinition=self.task_definition,
-            overrides=self.overrides,
-            startedBy=self.owner,
-            launchType=self.launch_type
-        )
+        run_opts = {
+            'cluster': self.cluster,
+            'taskDefinition': self.task_definition,
+            'overrides': self.overrides,
+            'startedBy': self.owner,
+            'launchType': self.launch_type,
+            'platformVersion': self.platform_version,
+        }
+        if self.group is not None:
+            run_opts['group'] = self.group
+        if self.placement_constraints is not None:
+            run_opts['placementConstraints'] = self.placement_constraints
+        if self.network_configuration is not None:
+            run_opts['networkConfiguration'] = self.network_configuration
+        response = self.client.run_task(**run_opts)

         failures = response['failures']
         if len(failures) > 0:
diff --git a/tests/contrib/operators/test_ecs_operator.py b/tests/contrib/operators/test_ecs_operator.py
index 43a816da4a..842db1a44a 100644
--- a/tests/contrib/operators/test_ecs_operator.py
+++ b/tests/contrib/operators/test_ecs_operator.py
@@ -69,7 +69,20 @@ def setUp(self, aws_hook_mock):
             cluster='c',
             overrides={},
             aws_conn_id=None,
-            region_name='eu-west-1')
+            region_name='eu-west-1',
+            group='group',
+            placement_constraints=[
+                {
+                    'expression': 'attribute:ecs.instance-type =~ t2.*',
+                    'type': 'memberOf'
+                }
+            ],
+            network_configuration={
+                'awsvpcConfiguration': {
+                    'securityGroups': ['sg-123abc']
+                }
+            }
+        )

     def test_init(self):
@@ -100,7 +113,20 @@ def test_execute_without_failures(self, check_mock, wait_mock):
             launchType='EC2',
             overrides={},
             startedBy=mock.ANY,  # Can by 'airflow' or 'Airflow'
-            taskDefinition='t'
+            taskDefinition='t',
+            group='group',
+            placementConstraints=[
+                {
+                    'expression': 'attribute:ecs.instance-type =~
```
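The pattern the diff above introduces, building the required run_task options unconditionally and adding optional ones only when supplied so the AWS API never receives explicit None values, can be sketched standalone (assumption: `build_run_opts` is a hypothetical helper name for illustration; in the PR this logic lives inline in `ECSOperator.execute`):

```python
# Standalone sketch of the option-assembly pattern from the diff.
def build_run_opts(cluster, task_definition, overrides, started_by,
                   launch_type='EC2', platform_version='LATEST',
                   group=None, placement_constraints=None,
                   network_configuration=None):
    # Required options are always present.
    run_opts = {
        'cluster': cluster,
        'taskDefinition': task_definition,
        'overrides': overrides,
        'startedBy': started_by,
        'launchType': launch_type,
        'platformVersion': platform_version,
    }
    # Optional options are included only when the caller provided them.
    if group is not None:
        run_opts['group'] = group
    if placement_constraints is not None:
        run_opts['placementConstraints'] = placement_constraints
    if network_configuration is not None:
        run_opts['networkConfiguration'] = network_configuration
    return run_opts


# Usage sketch with hypothetical values mirroring the test diff:
run_opts = build_run_opts('c', 't', {}, 'airflow', group='group')
```

Omitting unset keys, rather than passing them as None, matters because boto3-style APIs validate parameter types and would reject explicit None values.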
[jira] [Commented] (AIRFLOW-3036) Upgrading to Airflow 1.10 not possible using GCP Cloud SQL for MYSQL
[ https://issues.apache.org/jira/browse/AIRFLOW-3036?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16614546#comment-16614546 ]

Iuliia Volkova commented on AIRFLOW-3036:

[~bolke], hi Bolke, I saw that the explicit_defaults_for_timestamp check was added by you in [https://github.com/apache/incubator-airflow/pull/2979], so we need your opinion on this task and what we should do. explicit_defaults_for_timestamp is not in the list of flags supported by Google Cloud SQL: [https://cloud.google.com/sql/docs/mysql/flags]