fgalind1 commented on code in PR #31066: URL: https://github.com/apache/airflow/pull/31066#discussion_r1278417861
##########
docs/helm-chart/index.rst:
##########

@@ -155,3 +155,31 @@ To run database migrations with Argo CD automatically, you will need to add:

 This will run database migrations every time there is a ``Sync`` event in Argo CD. While it is not ideal to run the migrations on every sync, it is a trade-off that allows them to be run automatically.

 If you use the Celery(Kubernetes)Executor with the built-in Redis, it is recommended that you set up a static Redis password either by supplying ``redis.passwordSecretName`` and ``redis.data.brokerUrlSecretName`` or ``redis.password``.
+
+
+Naming Conventions
+------------------
+
+For new installations, it is highly recommended to start using the standard naming conventions.
+They are not enabled by default, as this may cause unexpected behaviours on existing installations. However, you can enable them using ``--set useStandardNaming=true``:
+
+.. code-block:: bash
+
+   helm upgrade airflow -n airflow . --set useStandardNaming=true
+
+For existing installations, all your resources will be recreated with a new name and Helm will delete the previous resources.
+
+This won't delete existing PVCs for logs used by the StatefulSets/Deployments, but it will recreate them with brand new PVCs.
+If you do want to preserve the logs history, you'll need to manually copy the data of these volumes into the new volumes after
+deployment. Depending on which storage backend/class you're using, this procedure may vary. If you don't mind starting
+with fresh logs/redis volumes, you can just delete the old PVCs, which will be named, for example:
+
+.. code-block:: bash
+
+   kubectl delete pvc -n airflow logs-gta-triggerer-0
+   kubectl delete pvc -n airflow logs-gta-worker-0
+   kubectl delete pvc -n airflow redis-db-gta-redis-0

Review Comment:
   added in https://github.com/apache/airflow/pull/31066/commits/a1f03ecc2baf0b78fb5144735fb333e2765d0b02

##########
docs/helm-chart/index.rst:
##########

@@ -155,3 +155,31 @@ To run database migrations with Argo CD automatically, you will need to add:

 This will run database migrations every time there is a ``Sync`` event in Argo CD. While it is not ideal to run the migrations on every sync, it is a trade-off that allows them to be run automatically.

 If you use the Celery(Kubernetes)Executor with the built-in Redis, it is recommended that you set up a static Redis password either by supplying ``redis.passwordSecretName`` and ``redis.data.brokerUrlSecretName`` or ``redis.password``.
+
+
+Naming Conventions
+------------------
+
+For new installations, it is highly recommended to start using the standard naming conventions.
+They are not enabled by default, as this may cause unexpected behaviours on existing installations. However, you can enable them using ``--set useStandardNaming=true``:
+
+.. code-block:: bash
+
+   helm upgrade airflow -n airflow . --set useStandardNaming=true

Review Comment:
   updated in https://github.com/apache/airflow/pull/31066/commits/a1f03ecc2baf0b78fb5144735fb333e2765d0b02

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]
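The docs above say to "manually copy the data of these volumes into the new volumes" but leave the procedure open, since it varies by storage backend. One possible sketch (not from the PR itself) is to mount both the old and the new logs PVC in a throwaway pod and copy across. All PVC and pod names below are hypothetical examples; list your actual claims with `kubectl get pvc -n airflow`, and note that ReadWriteOnce volumes may require scaling down the workloads that hold them first.

```shell
# Hedged sketch: copy data from an old logs PVC into its renamed replacement
# using a temporary busybox pod that mounts both claims.
# Claim names "logs-gta-worker-0" (old) and "logs-airflow-worker-0" (new)
# are examples only -- substitute the names from your own release.
kubectl apply -n airflow -f - <<'EOF'
apiVersion: v1
kind: Pod
metadata:
  name: pvc-copy
spec:
  restartPolicy: Never
  containers:
    - name: copy
      image: busybox
      # -a preserves ownership/permissions/timestamps of the log files
      command: ["sh", "-c", "cp -a /old/. /new/"]
      volumeMounts:
        - name: old-logs
          mountPath: /old
        - name: new-logs
          mountPath: /new
  volumes:
    - name: old-logs
      persistentVolumeClaim:
        claimName: logs-gta-worker-0        # old PVC (example name)
    - name: new-logs
      persistentVolumeClaim:
        claimName: logs-airflow-worker-0    # new PVC (example name)
EOF

# Wait for the copy to finish, then remove the helper pod.
kubectl wait -n airflow --for=jsonpath='{.status.phase}'=Succeeded pod/pvc-copy
kubectl delete pod -n airflow pvc-copy
```

Repeat per volume (triggerer, worker, redis) as needed; only once the data is verified in the new PVCs would you run the `kubectl delete pvc` commands shown in the diff.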
