This is an automated email from the ASF dual-hosted git repository.
github-bot pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/datafusion-comet.git
The following commit(s) were added to refs/heads/asf-site by this push:
new b8647c778 Publish built docs triggered by f0b36340af13bd98810212f2602db8d1cc2d8469
b8647c778 is described below
commit b8647c77889524d372bcc90ea80d39624c36798f
Author: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
AuthorDate: Fri Mar 14 22:51:22 2025 +0000
Publish built docs triggered by f0b36340af13bd98810212f2602db8d1cc2d8469
---
_sources/contributor-guide/debugging.md.txt | 2 +-
_sources/user-guide/installation.md.txt | 4 ++--
contributor-guide/debugging.html | 2 +-
searchindex.js | 2 +-
user-guide/installation.html | 4 ++--
5 files changed, 7 insertions(+), 7 deletions(-)
diff --git a/_sources/contributor-guide/debugging.md.txt b/_sources/contributor-guide/debugging.md.txt
index 21be9396d..2f958ce4b 100644
--- a/_sources/contributor-guide/debugging.md.txt
+++ b/_sources/contributor-guide/debugging.md.txt
@@ -130,7 +130,7 @@ Then build the Comet as [described](https://github.com/apache/arrow-datafusion-c
Start Comet with `RUST_BACKTRACE=1`
```console
-RUST_BACKTRACE=1 $SPARK_HOME/spark-shell --jars spark/target/comet-spark-spark3.4_2.12-0.7.0-SNAPSHOT.jar --conf spark.plugins=org.apache.spark.CometPlugin --conf spark.comet.enabled=true --conf spark.comet.exec.enabled=true
+RUST_BACKTRACE=1 $SPARK_HOME/spark-shell --jars spark/target/comet-spark-spark3.4_2.12-0.8.0-SNAPSHOT.jar --conf spark.plugins=org.apache.spark.CometPlugin --conf spark.comet.enabled=true --conf spark.comet.exec.enabled=true
```
Get the expanded exception details
diff --git a/_sources/user-guide/installation.md.txt b/_sources/user-guide/installation.md.txt
index cccf95056..937280ae4 100644
--- a/_sources/user-guide/installation.md.txt
+++ b/_sources/user-guide/installation.md.txt
@@ -74,7 +74,7 @@ See the [Comet Kubernetes Guide](kubernetes.md) guide.
Make sure `SPARK_HOME` points to the same Spark version as Comet was built for.
```console
-export COMET_JAR=spark/target/comet-spark-spark3.4_2.12-0.7.0-SNAPSHOT.jar
+export COMET_JAR=spark/target/comet-spark-spark3.4_2.12-0.8.0-SNAPSHOT.jar
$SPARK_HOME/bin/spark-shell \
--jars $COMET_JAR \
@@ -130,7 +130,7 @@ explicitly contain Comet otherwise Spark may use a different class-loader for th
components which will then fail at runtime. For example:
```
---driver-class-path spark/target/comet-spark-spark3.4_2.12-0.7.0-SNAPSHOT.jar
+--driver-class-path spark/target/comet-spark-spark3.4_2.12-0.8.0-SNAPSHOT.jar
```
Some cluster managers may require additional configuration, see <https://spark.apache.org/docs/latest/cluster-overview.html>
diff --git a/contributor-guide/debugging.html b/contributor-guide/debugging.html
index 2f3b4b009..cae5b1660 100644
--- a/contributor-guide/debugging.html
+++ b/contributor-guide/debugging.html
@@ -461,7 +461,7 @@ To enable this option with Comet it is needed to include <code class="docutils l
</div>
<p>Then build the Comet as <a class="reference external" href="https://github.com/apache/arrow-datafusion-comet/blob/main/README.md#getting-started">described</a></p>
<p>Start Comet with <code class="docutils literal notranslate"><span class="pre">RUST_BACKTRACE=1</span></code></p>
-<div class="highlight-console notranslate"><div class="highlight"><pre><span></span><span class="go">RUST_BACKTRACE=1 $SPARK_HOME/spark-shell --jars spark/target/comet-spark-spark3.4_2.12-0.7.0-SNAPSHOT.jar --conf spark.plugins=org.apache.spark.CometPlugin --conf spark.comet.enabled=true --conf spark.comet.exec.enabled=true</span>
+<div class="highlight-console notranslate"><div class="highlight"><pre><span></span><span class="go">RUST_BACKTRACE=1 $SPARK_HOME/spark-shell --jars spark/target/comet-spark-spark3.4_2.12-0.8.0-SNAPSHOT.jar --conf spark.plugins=org.apache.spark.CometPlugin --conf spark.comet.enabled=true --conf spark.comet.exec.enabled=true</span>
</pre></div>
</div>
<p>Get the expanded exception details</p>
diff --git a/searchindex.js b/searchindex.js
index 1f305bc11..0296f2106 100644
--- a/searchindex.js
+++ b/searchindex.js
@@ -1 +1 @@
-Search.setIndex({"alltitles": {"1. Install Comet": [[9, "install-comet"]], "2. Clone Spark and Apply Diff": [[9, "clone-spark-and-apply-diff"]], "3. Run Spark SQL Tests": [[9, "run-spark-sql-tests"]], "ANSI mode": [[11, "ansi-mode"]], "API Differences Between Spark Versions": [[0, "api-differences-between-spark-versions"]], "ASF Links": [[10, null]], "Adding Spark-side Tests for the New Expression": [[0, "adding-spark-side-tests-for-the-new-expression"]], "Adding a New Expression": [[0, [...]
\ No newline at end of file
+Search.setIndex({"alltitles": {"1. Install Comet": [[9, "install-comet"]], "2. Clone Spark and Apply Diff": [[9, "clone-spark-and-apply-diff"]], "3. Run Spark SQL Tests": [[9, "run-spark-sql-tests"]], "ANSI mode": [[11, "ansi-mode"]], "API Differences Between Spark Versions": [[0, "api-differences-between-spark-versions"]], "ASF Links": [[10, null]], "Adding Spark-side Tests for the New Expression": [[0, "adding-spark-side-tests-for-the-new-expression"]], "Adding a New Expression": [[0, [...]
\ No newline at end of file
diff --git a/user-guide/installation.html b/user-guide/installation.html
index 065befbf5..4e3aa87f6 100644
--- a/user-guide/installation.html
+++ b/user-guide/installation.html
@@ -443,7 +443,7 @@ source releases, or from the latest code in the GitHub repository.</p>
<section id="run-spark-shell-with-comet-enabled">
<h2>Run Spark Shell with Comet enabled<a class="headerlink" href="#run-spark-shell-with-comet-enabled" title="Link to this heading">¶</a></h2>
<p>Make sure <code class="docutils literal notranslate"><span class="pre">SPARK_HOME</span></code> points to the same Spark version as Comet was built for.</p>
-<div class="highlight-console notranslate"><div class="highlight"><pre><span></span><span class="go">export COMET_JAR=spark/target/comet-spark-spark3.4_2.12-0.7.0-SNAPSHOT.jar</span>
+<div class="highlight-console notranslate"><div class="highlight"><pre><span></span><span class="go">export COMET_JAR=spark/target/comet-spark-spark3.4_2.12-0.8.0-SNAPSHOT.jar</span>
<span class="gp">$</span>SPARK_HOME/bin/spark-shell<span class="w">
</span><span class="se">\</span>
<span class="w"> </span>--jars<span class="w"> </span><span
class="nv">$COMET_JAR</span><span class="w"> </span><span class="se">\</span>
@@ -494,7 +494,7 @@ being executed natively.</p>
<p>Depending on your deployment mode you may also need to set the driver & executor class path(s) to
explicitly contain Comet otherwise Spark may use a different class-loader for the Comet components than its internal
components which will then fail at runtime. For example:</p>
-<div class="highlight-default notranslate"><div
class="highlight"><pre><span></span><span class="o">--</span><span
class="n">driver</span><span class="o">-</span><span
class="n">class</span><span class="o">-</span><span class="n">path</span> <span
class="n">spark</span><span class="o">/</span><span
class="n">target</span><span class="o">/</span><span
class="n">comet</span><span class="o">-</span><span class="n">spark</span><span
class="o">-</span><span class="n">spark3</span><span class= [...]
+<div class="highlight-default notranslate"><div
class="highlight"><pre><span></span><span class="o">--</span><span
class="n">driver</span><span class="o">-</span><span
class="n">class</span><span class="o">-</span><span class="n">path</span> <span
class="n">spark</span><span class="o">/</span><span
class="n">target</span><span class="o">/</span><span
class="n">comet</span><span class="o">-</span><span class="n">spark</span><span
class="o">-</span><span class="n">spark3</span><span class= [...]
</pre></div>
</div>
<p>Some cluster managers may require additional configuration, see <a class="reference external" href="https://spark.apache.org/docs/latest/cluster-overview.html">https://spark.apache.org/docs/latest/cluster-overview.html</a></p>
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]