This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/spark-kubernetes-operator.git
The following commit(s) were added to refs/heads/main by this push:
new 0f6011c [SPARK-49397] Add `Clean Up` section to `README.md`
0f6011c is described below
commit 0f6011cccae003a0e9fb5b58ea1bf98ac78e54d8
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Mon Aug 26 12:53:21 2024 -0700
[SPARK-49397] Add `Clean Up` section to `README.md`
### What changes were proposed in this pull request?
This PR aims to document `Clean Up` steps in `README.md`.
### Why are the changes needed?
Documenting the clean-up steps explicitly is helpful for users because these
steps are frequently needed during troubleshooting.
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
Manual review.
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #106 from dongjoon-hyun/SPARK-49397.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
README.md | 27 +++++++++++++++++++++++++++
1 file changed, 27 insertions(+)
diff --git a/README.md b/README.md
index a4d89f2..e9cdee7 100644
--- a/README.md
+++ b/README.md
@@ -117,6 +117,33 @@ $ helm install spark-kubernetes-operator \
https://nightlies.apache.org/spark/charts/spark-kubernetes-operator-0.1.0-SNAPSHOT.tgz
```
+## Clean Up
+
+Check the existing Spark applications and clusters. If any exist, delete them.
+
+```
+$ kubectl get sparkapp
+No resources found in default namespace.
+
+$ kubectl get sparkcluster
+No resources found in default namespace.
+```
+
+Remove the Helm chart and CRDs.
+
+```
+$ helm uninstall spark-kubernetes-operator
+
+$ kubectl delete crd sparkapplications.spark.apache.org
+
+$ kubectl delete crd sparkclusters.spark.apache.org
+```
+
+For nightly builds, also remove the snapshot image.
+```
+$ docker rmi apache/spark-kubernetes-operator:main-snapshot
+```
+
## Contributing
Please review the [Contribution to Spark
guide](https://spark.apache.org/contributing.html)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]