This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/datafusion-comet.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new a742fb3ee Publish built docs triggered by f4bde1c4070f6ff619e88d62eded397c7926596f
a742fb3ee is described below

commit a742fb3eecaa32fa7a614e115900f5e48326f6a7
Author: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
AuthorDate: Thu Apr 10 14:26:52 2025 +0000

    Publish built docs triggered by f4bde1c4070f6ff619e88d62eded397c7926596f
---
 _sources/contributor-guide/debugging.md.txt   | 2 +-
 _sources/contributor-guide/development.md.txt | 6 +++---
 _sources/user-guide/datasources.md.txt        | 4 ++--
 _sources/user-guide/installation.md.txt       | 4 ++--
 _sources/user-guide/source.md.txt             | 8 ++++----
 contributor-guide/debugging.html              | 2 +-
 contributor-guide/development.html            | 6 +++---
 searchindex.js                                | 2 +-
 user-guide/datasources.html                   | 4 ++--
 user-guide/installation.html                  | 4 ++--
 user-guide/source.html                        | 8 ++++----
 11 files changed, 25 insertions(+), 25 deletions(-)

diff --git a/_sources/contributor-guide/debugging.md.txt b/_sources/contributor-guide/debugging.md.txt
index 2f958ce4b..94bdb072b 100644
--- a/_sources/contributor-guide/debugging.md.txt
+++ b/_sources/contributor-guide/debugging.md.txt
@@ -130,7 +130,7 @@ Then build the Comet as [described](https://github.com/apache/arrow-datafusion-c
 Start Comet with `RUST_BACKTRACE=1`
 
 ```console
-RUST_BACKTRACE=1 $SPARK_HOME/spark-shell --jars spark/target/comet-spark-spark3.4_2.12-0.8.0-SNAPSHOT.jar --conf spark.plugins=org.apache.spark.CometPlugin --conf spark.comet.enabled=true --conf spark.comet.exec.enabled=true
+RUST_BACKTRACE=1 $SPARK_HOME/spark-shell --jars spark/target/comet-spark-spark3.5_2.12-0.8.0-SNAPSHOT.jar --conf spark.plugins=org.apache.spark.CometPlugin --conf spark.comet.enabled=true --conf spark.comet.exec.enabled=true
 ```
 
 Get the expanded exception details
diff --git a/_sources/contributor-guide/development.md.txt b/_sources/contributor-guide/development.md.txt
index a9c43103b..93d2c709d 100644
--- a/_sources/contributor-guide/development.md.txt
+++ b/_sources/contributor-guide/development.md.txt
@@ -109,7 +109,7 @@ The tests can be run with:
 
 ```sh
 export SPARK_HOME=`pwd`
-./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite" -nsu test
+./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite" -Pspark-3.4 -nsu test
 ./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite" -Pspark-3.5 -nsu test
 ./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite" -Pspark-4.0 -nsu test
 ```
@@ -117,7 +117,7 @@ export SPARK_HOME=`pwd`
 and
 ```sh
 export SPARK_HOME=`pwd`
-./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV2_7_PlanStabilitySuite" -nsu test
+./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV2_7_PlanStabilitySuite" -Pspark-3.4 -nsu test
 ./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV2_7_PlanStabilitySuite" -Pspark-3.5 -nsu test
 ./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV2_7_PlanStabilitySuite" -Pspark-4.0 -nsu test
 ```
@@ -127,7 +127,7 @@ To regenerate the golden files, you can run the following commands.
 
 ```sh
 export SPARK_HOME=`pwd`
-SPARK_GENERATE_GOLDEN_FILES=1 ./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite" -nsu test
+SPARK_GENERATE_GOLDEN_FILES=1 ./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite" -Pspark-3.4 -nsu test
 SPARK_GENERATE_GOLDEN_FILES=1 ./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite" -Pspark-3.5 -nsu test
 SPARK_GENERATE_GOLDEN_FILES=1 ./mvnw -pl spark -Dsuites="org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite" -Pspark-4.0 -nsu test
 ```
diff --git a/_sources/user-guide/datasources.md.txt b/_sources/user-guide/datasources.md.txt
index 5cca4b505..5634df8e0 100644
--- a/_sources/user-guide/datasources.md.txt
+++ b/_sources/user-guide/datasources.md.txt
@@ -51,12 +51,12 @@ Unlike to native Comet reader the Datafusion reader fully supports nested types
 To build Comet with native DataFusion reader and remote HDFS support it is required to have a JDK installed
 
 Example:
-Build a Comet for `spark-3.4` provide a JDK path in `JAVA_HOME` 
+Build a Comet for `spark-3.5` provide a JDK path in `JAVA_HOME` 
 Provide the JRE linker path in `RUSTFLAGS`, the path can vary depending on the system. Typically JRE linker is a part of installed JDK
 
 ```shell
 export JAVA_HOME="/opt/homebrew/opt/openjdk@11"
-make release PROFILES="-Pspark-3.4" COMET_FEATURES=hdfs RUSTFLAGS="-L $JAVA_HOME/libexec/openjdk.jdk/Contents/Home/lib/server"
+make release PROFILES="-Pspark-3.5" COMET_FEATURES=hdfs RUSTFLAGS="-L $JAVA_HOME/libexec/openjdk.jdk/Contents/Home/lib/server"
 ```
 
 Start Comet with experimental reader and HDFS support as [described](installation.md/#run-spark-shell-with-comet-enabled)
diff --git a/_sources/user-guide/installation.md.txt b/_sources/user-guide/installation.md.txt
index d42b8de9a..74488bc60 100644
--- a/_sources/user-guide/installation.md.txt
+++ b/_sources/user-guide/installation.md.txt
@@ -85,7 +85,7 @@ See the [Comet Kubernetes Guide](kubernetes.md) guide.
 Make sure `SPARK_HOME` points to the same Spark version as Comet was built for.
 
 ```shell
-export COMET_JAR=spark/target/comet-spark-spark3.4_2.12-0.8.0-SNAPSHOT.jar
+export COMET_JAR=spark/target/comet-spark-spark3.5_2.12-0.8.0-SNAPSHOT.jar
 
 $SPARK_HOME/bin/spark-shell \
     --jars $COMET_JAR \
@@ -141,7 +141,7 @@ explicitly contain Comet otherwise Spark may use a different class-loader for th
 components which will then fail at runtime. For example:
 
 ```
---driver-class-path spark/target/comet-spark-spark3.4_2.12-0.8.0-SNAPSHOT.jar
+--driver-class-path spark/target/comet-spark-spark3.5_2.12-0.8.0-SNAPSHOT.jar
 ```
 
 Some cluster managers may require additional configuration, see <https://spark.apache.org/docs/latest/cluster-overview.html>
diff --git a/_sources/user-guide/source.md.txt b/_sources/user-guide/source.md.txt
index ab7f89745..b7038d341 100644
--- a/_sources/user-guide/source.md.txt
+++ b/_sources/user-guide/source.md.txt
@@ -38,7 +38,7 @@ cd apache-datafusion-comet-$COMET_VERSION
 Build
 
 ```console
-make release-nogit PROFILES="-Pspark-3.4"
+make release-nogit PROFILES="-Pspark-3.5"
 ```
 
 ## Building from the GitHub repository
@@ -53,17 +53,17 @@ Build Comet for a specific Spark version:
 
 ```console
 cd datafusion-comet
-make release PROFILES="-Pspark-3.4"
+make release PROFILES="-Pspark-3.5"
 ```
 
 Note that the project builds for Scala 2.12 by default but can be built for Scala 2.13 using an additional profile:
 
 ```console
-make release PROFILES="-Pspark-3.4 -Pscala-2.13"
+make release PROFILES="-Pspark-3.5 -Pscala-2.13"
 ```
 
 To build Comet from the source distribution on an isolated environment without an access to `github.com` it is necessary to disable `git-commit-id-maven-plugin`, otherwise you will face errors that there is no access to the git during the build process. In that case you may use:
 
 ```console
-make release-nogit PROFILES="-Pspark-3.4"
+make release-nogit PROFILES="-Pspark-3.5"
 ```
diff --git a/contributor-guide/debugging.html b/contributor-guide/debugging.html
index cae5b1660..507eae1dd 100644
--- a/contributor-guide/debugging.html
+++ b/contributor-guide/debugging.html
@@ -461,7 +461,7 @@ To enable this option with Comet it is needed to include <code class="docutils l
 </div>
 <p>Then build the Comet as <a class="reference external" href="https://github.com/apache/arrow-datafusion-comet/blob/main/README.md#getting-started">described</a></p>
 <p>Start Comet with <code class="docutils literal notranslate"><span class="pre">RUST_BACKTRACE=1</span></code></p>
-<div class="highlight-console notranslate"><div class="highlight"><pre><span></span><span class="go">RUST_BACKTRACE=1 $SPARK_HOME/spark-shell --jars spark/target/comet-spark-spark3.4_2.12-0.8.0-SNAPSHOT.jar --conf spark.plugins=org.apache.spark.CometPlugin --conf spark.comet.enabled=true --conf spark.comet.exec.enabled=true</span>
+<div class="highlight-console notranslate"><div class="highlight"><pre><span></span><span class="go">RUST_BACKTRACE=1 $SPARK_HOME/spark-shell --jars spark/target/comet-spark-spark3.5_2.12-0.8.0-SNAPSHOT.jar --conf spark.plugins=org.apache.spark.CometPlugin --conf spark.comet.enabled=true --conf spark.comet.exec.enabled=true</span>
 </pre></div>
 </div>
 <p>Get the expanded exception details</p>
diff --git a/contributor-guide/development.html b/contributor-guide/development.html
index d32dcad7c..081828449 100644
--- a/contributor-guide/development.html
+++ b/contributor-guide/development.html
@@ -479,14 +479,14 @@ The plan stability testing framework is located in the <code class="docutils lit
 <p>Note that the output files get written to <code class="docutils literal notranslate"><span class="pre">$SPARK_HOME</span></code>.</p>
 <p>The tests can be run with:</p>
 <div class="highlight-sh notranslate"><div class="highlight"><pre><span></span><span class="nb">export</span><span class="w"> </span><span class="nv">SPARK_HOME</span><span class="o">=</span><span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>
-./mvnw<span class="w"> </span>-pl<span class="w"> </span>spark<span class="w"> </span>-Dsuites<span class="o">=</span><span class="s2">&quot;org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite&quot;</span><span class="w"> </span>-nsu<span class="w"> </span><span class="nb">test</span>
+./mvnw<span class="w"> </span>-pl<span class="w"> </span>spark<span class="w"> </span>-Dsuites<span class="o">=</span><span class="s2">&quot;org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite&quot;</span><span class="w"> </span>-Pspark-3.4<span class="w"> </span>-nsu<span class="w"> </span><span class="nb">test</span>
 ./mvnw<span class="w"> </span>-pl<span class="w"> </span>spark<span class="w"> </span>-Dsuites<span class="o">=</span><span class="s2">&quot;org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite&quot;</span><span class="w"> </span>-Pspark-3.5<span class="w"> </span>-nsu<span class="w"> </span><span class="nb">test</span>
 ./mvnw<span class="w"> </span>-pl<span class="w"> </span>spark<span class="w"> </span>-Dsuites<span class="o">=</span><span class="s2">&quot;org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite&quot;</span><span class="w"> </span>-Pspark-4.0<span class="w"> </span>-nsu<span class="w"> </span><span class="nb">test</span>
 </pre></div>
 </div>
 <p>and</p>
 <div class="highlight-sh notranslate"><div class="highlight"><pre><span></span><span class="nb">export</span><span class="w"> </span><span class="nv">SPARK_HOME</span><span class="o">=</span><span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>
-./mvnw<span class="w"> </span>-pl<span class="w"> </span>spark<span class="w"> </span>-Dsuites<span class="o">=</span><span class="s2">&quot;org.apache.spark.sql.comet.CometTPCDSV2_7_PlanStabilitySuite&quot;</span><span class="w"> </span>-nsu<span class="w"> </span><span class="nb">test</span>
+./mvnw<span class="w"> </span>-pl<span class="w"> </span>spark<span class="w"> </span>-Dsuites<span class="o">=</span><span class="s2">&quot;org.apache.spark.sql.comet.CometTPCDSV2_7_PlanStabilitySuite&quot;</span><span class="w"> </span>-Pspark-3.4<span class="w"> </span>-nsu<span class="w"> </span><span class="nb">test</span>
 ./mvnw<span class="w"> </span>-pl<span class="w"> </span>spark<span class="w"> </span>-Dsuites<span class="o">=</span><span class="s2">&quot;org.apache.spark.sql.comet.CometTPCDSV2_7_PlanStabilitySuite&quot;</span><span class="w"> </span>-Pspark-3.5<span class="w"> </span>-nsu<span class="w"> </span><span class="nb">test</span>
 ./mvnw<span class="w"> </span>-pl<span class="w"> </span>spark<span class="w"> </span>-Dsuites<span class="o">=</span><span class="s2">&quot;org.apache.spark.sql.comet.CometTPCDSV2_7_PlanStabilitySuite&quot;</span><span class="w"> </span>-Pspark-4.0<span class="w"> </span>-nsu<span class="w"> </span><span class="nb">test</span>
 </pre></div>
@@ -494,7 +494,7 @@ The plan stability testing framework is located in the <code class="docutils lit
 <p>If your pull request changes the query plans generated by Comet, you should regenerate the golden files.
 To regenerate the golden files, you can run the following commands.</p>
 <div class="highlight-sh notranslate"><div class="highlight"><pre><span></span><span class="nb">export</span><span class="w"> </span><span class="nv">SPARK_HOME</span><span class="o">=</span><span class="sb">`</span><span class="nb">pwd</span><span class="sb">`</span>
-<span class="nv">SPARK_GENERATE_GOLDEN_FILES</span><span class="o">=</span><span class="m">1</span><span class="w"> </span>./mvnw<span class="w"> </span>-pl<span class="w"> </span>spark<span class="w"> </span>-Dsuites<span class="o">=</span><span class="s2">&quot;org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite&quot;</span><span class="w"> </span>-nsu<span class="w"> </span><span class="nb">test</span>
+<span class="nv">SPARK_GENERATE_GOLDEN_FILES</span><span class="o">=</span><span class="m">1</span><span class="w"> </span>./mvnw<span class="w"> </span>-pl<span class="w"> </span>spark<span class="w"> </span>-Dsuites<span class="o">=</span><span class="s2">&quot;org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite&quot;</span><span class="w"> </span>-Pspark-3.4<span class="w"> </span>-nsu<span class="w"> </span><span class="nb">test</span>
 <span class="nv">SPARK_GENERATE_GOLDEN_FILES</span><span class="o">=</span><span class="m">1</span><span class="w"> </span>./mvnw<span class="w"> </span>-pl<span class="w"> </span>spark<span class="w"> </span>-Dsuites<span class="o">=</span><span class="s2">&quot;org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite&quot;</span><span class="w"> </span>-Pspark-3.5<span class="w"> </span>-nsu<span class="w"> </span><span class="nb">test</span>
 <span class="nv">SPARK_GENERATE_GOLDEN_FILES</span><span class="o">=</span><span class="m">1</span><span class="w"> </span>./mvnw<span class="w"> </span>-pl<span class="w"> </span>spark<span class="w"> </span>-Dsuites<span class="o">=</span><span class="s2">&quot;org.apache.spark.sql.comet.CometTPCDSV1_4_PlanStabilitySuite&quot;</span><span class="w"> </span>-Pspark-4.0<span class="w"> </span>-nsu<span class="w"> </span><span class="nb">test</span>
 </pre></div>
diff --git a/searchindex.js b/searchindex.js
index 35b3f52da..ae5d8c3d4 100644
--- a/searchindex.js
+++ b/searchindex.js
@@ -1 +1 @@
-Search.setIndex({"alltitles": {"1. Install Comet": [[10, "install-comet"]], "2. Clone Spark and Apply Diff": [[10, "clone-spark-and-apply-diff"]], "3. Run Spark SQL Tests": [[10, "run-spark-sql-tests"]], "ANSI mode": [[12, "ansi-mode"]], "API Differences Between Spark Versions": [[0, "api-differences-between-spark-versions"]], "ASF Links": [[11, null]], "Adding Spark-side Tests for the New Expression": [[0, "adding-spark-side-tests-for-the-new-expression"]], "Adding a New Expression": [[ [...]
\ No newline at end of file
+Search.setIndex({"alltitles": {"1. Install Comet": [[10, "install-comet"]], "2. Clone Spark and Apply Diff": [[10, "clone-spark-and-apply-diff"]], "3. Run Spark SQL Tests": [[10, "run-spark-sql-tests"]], "ANSI mode": [[12, "ansi-mode"]], "API Differences Between Spark Versions": [[0, "api-differences-between-spark-versions"]], "ASF Links": [[11, null]], "Adding Spark-side Tests for the New Expression": [[0, "adding-spark-side-tests-for-the-new-expression"]], "Adding a New Expression": [[ [...]
\ No newline at end of file
diff --git a/user-guide/datasources.html b/user-guide/datasources.html
index 0028332aa..332fc0470 100644
--- a/user-guide/datasources.html
+++ b/user-guide/datasources.html
@@ -417,10 +417,10 @@ converted into Arrow format, allowing native execution to happen after that.</p>
 <p>Unlike to native Comet reader the Datafusion reader fully supports nested types processing. This reader is currently experimental only</p>
 <p>To build Comet with native DataFusion reader and remote HDFS support it is required to have a JDK installed</p>
 <p>Example:
-Build a Comet for <code class="docutils literal notranslate"><span class="pre">spark-3.4</span></code> provide a JDK path in <code class="docutils literal notranslate"><span class="pre">JAVA_HOME</span></code>
+Build a Comet for <code class="docutils literal notranslate"><span class="pre">spark-3.5</span></code> provide a JDK path in <code class="docutils literal notranslate"><span class="pre">JAVA_HOME</span></code>
 Provide the JRE linker path in <code class="docutils literal notranslate"><span class="pre">RUSTFLAGS</span></code>, the path can vary depending on the system. Typically JRE linker is a part of installed JDK</p>
 <div class="highlight-shell notranslate"><div class="highlight"><pre><span></span><span class="nb">export</span><span class="w"> </span><span class="nv">JAVA_HOME</span><span class="o">=</span><span class="s2">&quot;/opt/homebrew/opt/openjdk@11&quot;</span>
-make<span class="w"> </span>release<span class="w"> </span><span class="nv">PROFILES</span><span class="o">=</span><span class="s2">&quot;-Pspark-3.4&quot;</span><span class="w"> </span><span class="nv">COMET_FEATURES</span><span class="o">=</span>hdfs<span class="w"> </span><span class="nv">RUSTFLAGS</span><span class="o">=</span><span class="s2">&quot;-L </span><span class="nv">$JAVA_HOME</span><span class="s2">/libexec/openjdk.jdk/Contents/Home/lib/server&quot;</span>
+make<span class="w"> </span>release<span class="w"> </span><span class="nv">PROFILES</span><span class="o">=</span><span class="s2">&quot;-Pspark-3.5&quot;</span><span class="w"> </span><span class="nv">COMET_FEATURES</span><span class="o">=</span>hdfs<span class="w"> </span><span class="nv">RUSTFLAGS</span><span class="o">=</span><span class="s2">&quot;-L </span><span class="nv">$JAVA_HOME</span><span class="s2">/libexec/openjdk.jdk/Contents/Home/lib/server&quot;</span>
 </pre></div>
 </div>
 <p>Start Comet with experimental reader and HDFS support as <a class="reference internal" href="installation.html#run-spark-shell-with-comet-enabled"><span class="std std-ref">described</span></a>
diff --git a/user-guide/installation.html b/user-guide/installation.html
index 80ed2deee..d147fe910 100644
--- a/user-guide/installation.html
+++ b/user-guide/installation.html
@@ -498,7 +498,7 @@ source releases, or from the latest code in the GitHub repository.</p>
 <section id="run-spark-shell-with-comet-enabled">
 <h2>Run Spark Shell with Comet enabled<a class="headerlink" href="#run-spark-shell-with-comet-enabled" title="Link to this heading">¶</a></h2>
 <p>Make sure <code class="docutils literal notranslate"><span class="pre">SPARK_HOME</span></code> points to the same Spark version as Comet was built for.</p>
-<div class="highlight-shell notranslate"><div class="highlight"><pre><span></span><span class="nb">export</span><span class="w"> </span><span class="nv">COMET_JAR</span><span class="o">=</span>spark/target/comet-spark-spark3.4_2.12-0.8.0-SNAPSHOT.jar
+<div class="highlight-shell notranslate"><div class="highlight"><pre><span></span><span class="nb">export</span><span class="w"> </span><span class="nv">COMET_JAR</span><span class="o">=</span>spark/target/comet-spark-spark3.5_2.12-0.8.0-SNAPSHOT.jar
 
 <span class="nv">$SPARK_HOME</span>/bin/spark-shell<span class="w"> </span><span class="se">\</span>
 <span class="w">    </span>--jars<span class="w"> </span><span class="nv">$COMET_JAR</span><span class="w"> </span><span class="se">\</span>
@@ -549,7 +549,7 @@ being executed natively.</p>
 <p>Depending on your deployment mode you may also need to set the driver &amp; executor class path(s) to
 explicitly contain Comet otherwise Spark may use a different class-loader for the Comet components than its internal
 components which will then fail at runtime. For example:</p>
-<div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="o">--</span><span class="n">driver</span><span class="o">-</span><span class="n">class</span><span class="o">-</span><span class="n">path</span> <span class="n">spark</span><span class="o">/</span><span class="n">target</span><span class="o">/</span><span class="n">comet</span><span class="o">-</span><span class="n">spark</span><span class="o">-</span><span class="n">spark3</span><span class= [...]
+<div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="o">--</span><span class="n">driver</span><span class="o">-</span><span class="n">class</span><span class="o">-</span><span class="n">path</span> <span class="n">spark</span><span class="o">/</span><span class="n">target</span><span class="o">/</span><span class="n">comet</span><span class="o">-</span><span class="n">spark</span><span class="o">-</span><span class="n">spark3</span><span class= [...]
 </pre></div>
 </div>
 <p>Some cluster managers may require additional configuration, see <a class="reference external" href="https://spark.apache.org/docs/latest/cluster-overview.html">https://spark.apache.org/docs/latest/cluster-overview.html</a></p>
diff --git a/user-guide/source.html b/user-guide/source.html
index b58ae13a5..9fb0976cb 100644
--- a/user-guide/source.html
+++ b/user-guide/source.html
@@ -358,7 +358,7 @@ under the License.
 </pre></div>
 </div>
 <p>Build</p>
-<div class="highlight-console notranslate"><div class="highlight"><pre><span></span><span class="go">make release-nogit PROFILES=&quot;-Pspark-3.4&quot;</span>
+<div class="highlight-console notranslate"><div class="highlight"><pre><span></span><span class="go">make release-nogit PROFILES=&quot;-Pspark-3.5&quot;</span>
 </pre></div>
 </div>
 </section>
@@ -370,15 +370,15 @@ under the License.
 </div>
 <p>Build Comet for a specific Spark version:</p>
 <div class="highlight-console notranslate"><div class="highlight"><pre><span></span><span class="go">cd datafusion-comet</span>
-<span class="go">make release PROFILES=&quot;-Pspark-3.4&quot;</span>
+<span class="go">make release PROFILES=&quot;-Pspark-3.5&quot;</span>
 </pre></div>
 </div>
 <p>Note that the project builds for Scala 2.12 by default but can be built for Scala 2.13 using an additional profile:</p>
-<div class="highlight-console notranslate"><div class="highlight"><pre><span></span><span class="go">make release PROFILES=&quot;-Pspark-3.4 -Pscala-2.13&quot;</span>
+<div class="highlight-console notranslate"><div class="highlight"><pre><span></span><span class="go">make release PROFILES=&quot;-Pspark-3.5 -Pscala-2.13&quot;</span>
 </pre></div>
 </div>
 <p>To build Comet from the source distribution on an isolated environment without an access to <code class="docutils literal notranslate"><span class="pre">github.com</span></code> it is necessary to disable <code class="docutils literal notranslate"><span class="pre">git-commit-id-maven-plugin</span></code>, otherwise you will face errors that there is no access to the git during the build process. In that case you may use:</p>
-<div class="highlight-console notranslate"><div class="highlight"><pre><span></span><span class="go">make release-nogit PROFILES=&quot;-Pspark-3.4&quot;</span>
+<div class="highlight-console notranslate"><div class="highlight"><pre><span></span><span class="go">make release-nogit PROFILES=&quot;-Pspark-3.5&quot;</span>
 </pre></div>
 </div>
 </section>

