mik-laj commented on a change in pull request #18494:
URL: https://github.com/apache/airflow/pull/18494#discussion_r716870133



##########
File path: airflow/providers/google/cloud/example_dags/example_cloud_sql.py
##########
@@ -48,8 +49,8 @@
 from airflow.utils.dates import days_ago
 
 GCP_PROJECT_ID = os.environ.get('GCP_PROJECT_ID', 'example-project')
-INSTANCE_NAME = os.environ.get('GCSQL_MYSQL_INSTANCE_NAME', 'test-mysql')
-INSTANCE_NAME2 = os.environ.get('GCSQL_MYSQL_INSTANCE_NAME2', 'test-mysql2')
+INSTANCE_NAME = os.environ.get('GCSQL_MYSQL_INSTANCE_NAME', 'test-mysql') + str(random.getrandbits(16))

Review comment:
       @mnojek Airflow is a distributed application, which means that a single DAG
file is loaded multiple times on different nodes, so we have to make sure that
the instance name has the same value on all of them. These examples are used in
system tests, where this condition is not necessary because we have shared
state, but the examples are also an inspiration for novice users who may use
another executor, e.g. CeleryExecutor, where each DAG file is parsed on every
node separately.
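
A minimal sketch of one way to keep the suffix stable across nodes (this is an
illustration, not part of the PR): derive it deterministically from a value
every parser already agrees on, here GCP_PROJECT_ID, instead of calling
random.getrandbits() at module level.

```python
import hashlib
import os

GCP_PROJECT_ID = os.environ.get('GCP_PROJECT_ID', 'example-project')

# Every node that parses this DAG file computes the same digest, unlike
# random.getrandbits(16), which produces a different value in each process.
_SUFFIX = hashlib.sha256(GCP_PROJECT_ID.encode()).hexdigest()[:6]

INSTANCE_NAME = os.environ.get('GCSQL_MYSQL_INSTANCE_NAME', 'test-mysql') + _SUFFIX
```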



