This is an automated email from the ASF dual-hosted git repository.
github-bot pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/datafusion-comet.git
The following commit(s) were added to refs/heads/asf-site by this push:
new 8f2e5cc84 Publish built docs triggered by 46b162c8442c486d2290fbd0cf3dd2f2d188c55e
8f2e5cc84 is described below
commit 8f2e5cc84ebe72701400645aed13adce2e74e6a7
Author: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
AuthorDate: Wed Mar 19 21:50:49 2025 +0000
Publish built docs triggered by 46b162c8442c486d2290fbd0cf3dd2f2d188c55e
---
_sources/user-guide/kubernetes.md.txt | 38 +++++++++++++++++++++++++++++------
searchindex.js | 2 +-
user-guide/kubernetes.html | 36 +++++++++++++++++++++++++++------
3 files changed, 63 insertions(+), 13 deletions(-)
diff --git a/_sources/user-guide/kubernetes.md.txt b/_sources/user-guide/kubernetes.md.txt
index 1788d3f9a..0d1418f83 100644
--- a/_sources/user-guide/kubernetes.md.txt
+++ b/_sources/user-guide/kubernetes.md.txt
@@ -37,16 +37,17 @@ found
[here](https://github.com/apache/datafusion-comet/tree/main/benchmarks).
Install helm Spark operator for Kubernetes
```bash
+# Add the Helm repository
helm repo add spark-operator https://kubeflow.github.io/spark-operator
-
helm repo update
-helm install my-release spark-operator/spark-operator --namespace spark-operator --create-namespace --set webhook.enable=true
-````
+# Install the operator into the spark-operator namespace and wait for deployments to be ready
+helm install spark-operator spark-operator/spark-operator --namespace spark-operator --create-namespace --wait
+```
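Once the chart install finishes, the operator workloads and the SparkApplication CRD should also be visible with kubectl; a quick sketch of those checks (resource and namespace names assume the chart defaults used above):
```bash
# List the operator pods in the namespace used for the install
kubectl get pods --namespace spark-operator

# Confirm the SparkApplication CRD was registered by the chart
kubectl get crd sparkapplications.sparkoperator.k8s.io
```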
Check the operator is deployed
```bash
-helm status --namespace spark-operator my-release
+helm status --namespace spark-operator spark-operator
NAME: spark-operator
NAMESPACE: spark-operator
@@ -91,7 +92,7 @@ spec:
labels:
version: 3.5.4
instances: 1
- cores: 2
+ cores: 1
coreLimit: 1200m
memory: 512m
```
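Before submitting, the manifest can be validated against the CRD schema without creating anything; a minimal sketch, assuming spark-pi.yaml holds the spec shown above:
```bash
# Server-side dry run: the API server validates the SparkApplication
# against the operator's CRD schema but does not create the object
kubectl apply --dry-run=server -f spark-pi.yaml
```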
@@ -100,10 +101,35 @@ Refer to [Comet builds](#comet-docker-images)
Run Apache Spark application with Comet enabled
```bash
kubectl apply -f spark-pi.yaml
+sparkapplication.sparkoperator.k8s.io/spark-pi created
```
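The operator then creates a driver pod named after the application; a sketch for blocking until that pod is Ready (the timeout is arbitrary, and the pod is assumed to land in the current namespace):
```bash
# Wait for the driver pod spawned by the operator to become Ready
kubectl wait --for=condition=Ready pod/spark-pi-driver --timeout=120s
```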
Check application status
```bash
-kubectl describe sparkapplication --namespace=spark-operator
+kubectl get sparkapp spark-pi
+
+NAME       STATUS    ATTEMPTS   START                  FINISH       AGE
+spark-pi   RUNNING   1          2025-03-18T21:19:48Z   <no value>   65s
+```
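To watch the status change over time, or to extract just the application state, something like the following should work (the jsonpath assumes the v1beta2 SparkApplication status layout):
```bash
# Watch the status column until it reaches COMPLETED or FAILED
kubectl get sparkapplication spark-pi -w

# Print only the aggregate application state
kubectl get sparkapplication spark-pi -o jsonpath='{.status.applicationState.state}'
```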
+To check more runtime details
+```bash
+kubectl describe sparkapplication spark-pi
+
+....
+Events:
+  Type    Reason                     Age    From                          Message
+  ----    ------                     ----   ----                          -------
+  Normal  SparkApplicationSubmitted  8m15s  spark-application-controller  SparkApplication spark-pi was submitted successfully
+  Normal  SparkDriverRunning         7m18s  spark-application-controller  Driver spark-pi-driver is running
+  Normal  SparkExecutorPending       7m11s  spark-application-controller  Executor [spark-pi-68732195ab217303-exec-1] is pending
+  Normal  SparkExecutorRunning       7m10s  spark-application-controller  Executor [spark-pi-68732195ab217303-exec-1] is running
+  Normal  SparkExecutorCompleted     7m5s   spark-application-controller  Executor [spark-pi-68732195ab217303-exec-1] completed
+  Normal  SparkDriverCompleted       7m4s   spark-application-controller  Driver spark-pi-driver completed
+
+```
+
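The same events can also be queried directly, which is easier to scan than the full describe output; a sketch, assuming the application lives in the current namespace:
```bash
# Events recorded against the spark-pi object, oldest first
kubectl get events --field-selector involvedObject.name=spark-pi --sort-by=.lastTimestamp
```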
+Get Driver Logs
+```bash
+kubectl logs spark-pi-driver
```
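To follow the driver log while the job is still running, or to pull executor logs as well, a sketch along these lines may help (the spark-role=executor label is an assumption about Spark's standard Kubernetes pod labels):
```bash
# Stream the driver log until the application finishes
kubectl logs -f spark-pi-driver

# Fetch recent log lines from executor pods by label
kubectl logs -l spark-role=executor --tail=100
```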
More info on [Kube Spark operator](https://www.kubeflow.org/docs/components/spark-operator/getting-started/)
\ No newline at end of file
diff --git a/searchindex.js b/searchindex.js
index 4c1e3bc85..2699d7ad6 100644
--- a/searchindex.js
+++ b/searchindex.js
@@ -1 +1 @@
-Search.setIndex({"alltitles": {"1. Install Comet": [[9, "install-comet"]], "2.
Clone Spark and Apply Diff": [[9, "clone-spark-and-apply-diff"]], "3. Run Spark
SQL Tests": [[9, "run-spark-sql-tests"]], "ANSI mode": [[11, "ansi-mode"]],
"API Differences Between Spark Versions": [[0,
"api-differences-between-spark-versions"]], "ASF Links": [[10, null]], "Adding
Spark-side Tests for the New Expression": [[0,
"adding-spark-side-tests-for-the-new-expression"]], "Adding a New Expression":
[[0, [...]
\ No newline at end of file
+Search.setIndex({"alltitles": {"1. Install Comet": [[9, "install-comet"]], "2.
Clone Spark and Apply Diff": [[9, "clone-spark-and-apply-diff"]], "3. Run Spark
SQL Tests": [[9, "run-spark-sql-tests"]], "ANSI mode": [[11, "ansi-mode"]],
"API Differences Between Spark Versions": [[0,
"api-differences-between-spark-versions"]], "ASF Links": [[10, null]], "Adding
Spark-side Tests for the New Expression": [[0,
"adding-spark-side-tests-for-the-new-expression"]], "Adding a New Expression":
[[0, [...]
\ No newline at end of file
diff --git a/user-guide/kubernetes.html b/user-guide/kubernetes.html
index 486ec8801..0c10814c7 100644
--- a/user-guide/kubernetes.html
+++ b/user-guide/kubernetes.html
@@ -365,15 +365,16 @@ found <a class="reference external"
href="https://github.com/apache/datafusion-c
<section id="helm-chart">
<h2>Helm chart<a class="headerlink" href="#helm-chart" title="Link to this
heading">ΒΆ</a></h2>
<p>Install helm Spark operator for Kubernetes</p>
-<div class="highlight-bash notranslate"><div
class="highlight"><pre><span></span>helm<span class="w"> </span>repo<span
class="w"> </span>add<span class="w"> </span>spark-operator<span class="w">
</span>https://kubeflow.github.io/spark-operator
-
+<div class="highlight-bash notranslate"><div
class="highlight"><pre><span></span><span class="c1"># Add the Helm
repository</span>
+helm<span class="w"> </span>repo<span class="w"> </span>add<span class="w">
</span>spark-operator<span class="w">
</span>https://kubeflow.github.io/spark-operator
helm<span class="w"> </span>repo<span class="w"> </span>update
-helm<span class="w"> </span>install<span class="w"> </span>my-release<span
class="w"> </span>spark-operator/spark-operator<span class="w">
</span>--namespace<span class="w"> </span>spark-operator<span class="w">
</span>--create-namespace<span class="w"> </span>--set<span class="w">
</span>webhook.enable<span class="o">=</span><span class="nb">true</span>
+<span class="c1"># Install the operator into the spark-operator namespace and
wait for deployments to be ready</span>
+helm<span class="w"> </span>install<span class="w"> </span>spark-operator<span
class="w"> </span>spark-operator/spark-operator<span class="w">
</span>--namespace<span class="w"> </span>spark-operator<span class="w">
</span>--create-namespace<span class="w"> </span>--wait
</pre></div>
</div>
<p>Check the operator is deployed</p>
-<div class="highlight-bash notranslate"><div
class="highlight"><pre><span></span>helm<span class="w"> </span>status<span
class="w"> </span>--namespace<span class="w"> </span>spark-operator<span
class="w"> </span>my-release
+<div class="highlight-bash notranslate"><div
class="highlight"><pre><span></span>helm<span class="w"> </span>status<span
class="w"> </span>--namespace<span class="w"> </span>spark-operator<span
class="w"> </span>spark-operator
NAME:<span class="w"> </span>spark-operator
NAMESPACE:<span class="w"> </span>spark-operator
@@ -417,7 +418,7 @@ spec:
<span class="w"> </span>labels:
<span class="w"> </span>version:<span class="w"> </span><span
class="m">3</span>.5.4
<span class="w"> </span>instances:<span class="w"> </span><span
class="m">1</span>
-<span class="w"> </span>cores:<span class="w"> </span><span
class="m">2</span>
+<span class="w"> </span>cores:<span class="w"> </span><span
class="m">1</span>
<span class="w"> </span>coreLimit:<span class="w"> </span>1200m
<span class="w"> </span>memory:<span class="w"> </span>512m
</pre></div>
@@ -425,10 +426,33 @@ spec:
<p>Refer to <a class="reference internal" href="#comet-docker-images">Comet
builds</a></p>
<p>Run Apache Spark application with Comet enabled</p>
<div class="highlight-bash notranslate"><div
class="highlight"><pre><span></span>kubectl<span class="w"> </span>apply<span
class="w"> </span>-f<span class="w"> </span>spark-pi.yaml
+sparkapplication.sparkoperator.k8s.io/spark-pi<span class="w"> </span>created
</pre></div>
</div>
<p>Check application status</p>
-<div class="highlight-bash notranslate"><div
class="highlight"><pre><span></span>kubectl<span class="w">
</span>describe<span class="w"> </span>sparkapplication<span class="w">
</span>--namespace<span class="o">=</span>spark-operator
+<div class="highlight-bash notranslate"><div
class="highlight"><pre><span></span>kubectl<span class="w"> </span>get<span
class="w"> </span>sparkapp<span class="w"> </span>spark-pi
+
+NAME<span class="w"> </span>STATUS<span class="w">
</span>ATTEMPTS<span class="w"> </span>START<span class="w">
</span>FINISH<span class="w"> </span>AGE
+spark-pi<span class="w"> </span>RUNNING<span class="w"> </span><span
class="m">1</span><span class="w"> </span><span
class="m">2025</span>-03-18T21:19:48Z<span class="w"> </span><no<span
class="w"> </span>value><span class="w"> </span>65s
+</pre></div>
+</div>
+<p>To check more runtime details</p>
+<div class="highlight-bash notranslate"><div
class="highlight"><pre><span></span>kubectl<span class="w">
</span>describe<span class="w"> </span>sparkapplication<span class="w">
</span>spark-pi
+
+....
+Events:
+<span class="w"> </span>Type<span class="w"> </span>Reason<span class="w">
</span>Age<span class="w"> </span>From<span class="w">
</span>Message
+<span class="w"> </span>----<span class="w"> </span>------<span class="w">
</span>----<span class="w"> </span>----<span class="w">
</span>-------
+<span class="w"> </span>Normal<span class="w">
</span>SparkApplicationSubmitted<span class="w"> </span>8m15s<span class="w">
</span>spark-application-controller<span class="w">
</span>SparkApplication<span class="w"> </span>spark-pi<span class="w">
</span>was<span class="w"> </span>submitted<span class="w"> </span>successfully
+<span class="w"> </span>Normal<span class="w">
</span>SparkDriverRunning<span class="w"> </span>7m18s<span class="w">
</span>spark-application-controller<span class="w"> </span>Driver<span
class="w"> </span>spark-pi-driver<span class="w"> </span>is<span class="w">
</span>running
+<span class="w"> </span>Normal<span class="w">
</span>SparkExecutorPending<span class="w"> </span>7m11s<span class="w">
</span>spark-application-controller<span class="w"> </span>Executor<span
class="w"> </span><span class="o">[</span>spark-pi-68732195ab217303-exec-1<span
class="o">]</span><span class="w"> </span>is<span class="w"> </span>pending
+<span class="w"> </span>Normal<span class="w">
</span>SparkExecutorRunning<span class="w"> </span>7m10s<span class="w">
</span>spark-application-controller<span class="w"> </span>Executor<span
class="w"> </span><span class="o">[</span>spark-pi-68732195ab217303-exec-1<span
class="o">]</span><span class="w"> </span>is<span class="w"> </span>running
+<span class="w"> </span>Normal<span class="w">
</span>SparkExecutorCompleted<span class="w"> </span>7m5s<span class="w">
</span>spark-application-controller<span class="w"> </span>Executor<span
class="w"> </span><span class="o">[</span>spark-pi-68732195ab217303-exec-1<span
class="o">]</span><span class="w"> </span>completed
+<span class="w"> </span>Normal<span class="w">
</span>SparkDriverCompleted<span class="w"> </span>7m4s<span class="w">
</span>spark-application-controller<span class="w"> </span>Driver<span
class="w"> </span>spark-pi-driver<span class="w"> </span>completed
+</pre></div>
+</div>
+<p>Get Driver Logs</p>
+<div class="highlight-bash notranslate"><div
class="highlight"><pre><span></span>kubectl<span class="w"> </span>logs<span
class="w"> </span>spark-pi-driver
</pre></div>
</div>
<p>More info on <a class="reference external"
href="https://www.kubeflow.org/docs/components/spark-operator/getting-started/">Kube
Spark operator</a></p>
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]