This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/datafusion-comet.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new 39e3651da Publish built docs triggered by 2c0dc40ba1275117bbc15a28f18925b1ec06228f
39e3651da is described below

commit 39e3651da90cc6d273200d810382b6c61e8a3222
Author: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
AuthorDate: Thu Aug 7 18:50:27 2025 +0000

    Publish built docs triggered by 2c0dc40ba1275117bbc15a28f18925b1ec06228f
---
 _sources/contributor-guide/benchmarking_macos.md.txt | 10 ++++++++++
 contributor-guide/benchmarking_macos.html            |  8 ++++++++
 searchindex.js                                       |  2 +-
 3 files changed, 19 insertions(+), 1 deletion(-)

diff --git a/_sources/contributor-guide/benchmarking_macos.md.txt b/_sources/contributor-guide/benchmarking_macos.md.txt
index 0f44babd8..a6e712ea9 100644
--- a/_sources/contributor-guide/benchmarking_macos.md.txt
+++ b/_sources/contributor-guide/benchmarking_macos.md.txt
@@ -61,6 +61,16 @@ Start Spark in standalone mode:
 $SPARK_HOME/sbin/start-master.sh
 ```
 
+Note: For Apache Spark distributions installed with `brew`, the `$SPARK_HOME/sbin` folder may be missing on your machine. However, it is still possible to start the Apache Spark master by running:
+```shell
+$SPARK_HOME/bin/spark-class org.apache.spark.deploy.master.Master
+```
+
+Once the master has started, look in its output for the master endpoint URI, for example:
+```shell
+INFO Master: Starting Spark master at spark://192.168.4.142:7078
+```
+
 Set `SPARK_MASTER` env var (host name will need to be edited):
 
 ```shell
diff --git a/contributor-guide/benchmarking_macos.html b/contributor-guide/benchmarking_macos.html
index f2e3f7db5..960988546 100644
--- a/contributor-guide/benchmarking_macos.html
+++ b/contributor-guide/benchmarking_macos.html
@@ -407,6 +407,14 @@ mkdir<span class="w"> </span>/tmp/spark-events
 <div class="highlight-shell notranslate"><div 
class="highlight"><pre><span></span><span 
class="nv">$SPARK_HOME</span>/sbin/start-master.sh
 </pre></div>
 </div>
+<p>Note: For Apache Spark distributions installed with <code class="docutils literal notranslate"><span class="pre">brew</span></code>, the <code class="docutils literal notranslate"><span class="pre">$SPARK_HOME/sbin</span></code> folder may be missing on your machine. However, it is still possible to start the Apache Spark master by running:</p>
+<div class="highlight-shell notranslate"><div class="highlight"><pre><span></span><span class="nv">$SPARK_HOME</span>/bin/spark-class<span class="w"> </span>org.apache.spark.deploy.master.Master
+</pre></div>
+</div>
+<p>Once the master has started, look in its output for the master endpoint URI, for example:</p>
+<div class="highlight-shell notranslate"><div class="highlight"><pre><span></span>INFO<span class="w"> </span>Master:<span class="w"> </span>Starting<span class="w"> </span>Spark<span class="w"> </span>master<span class="w"> </span>at<span class="w"> </span>spark://192.168.4.142:7078
+</pre></div>
+</div>
 <p>Set <code class="docutils literal notranslate"><span class="pre">SPARK_MASTER</span></code> env var (host name will need to be edited):</p>
 <div class="highlight-shell notranslate"><div class="highlight"><pre><span></span><span class="nb">export</span><span class="w"> </span><span class="nv">SPARK_MASTER</span><span class="o">=</span>spark://Rustys-MacBook-Pro.local:7077
 </pre></div>
diff --git a/searchindex.js b/searchindex.js
index 2216bafc2..c0b0ea622 100644
--- a/searchindex.js
+++ b/searchindex.js
@@ -1 +1 @@
-Search.setIndex({"alltitles": {"1. Install Comet": [[11, "install-comet"]], 
"2. Clone Spark and Apply Diff": [[11, "clone-spark-and-apply-diff"]], "3. Run 
Spark SQL Tests": [[11, "run-spark-sql-tests"]], "ANSI Mode": [[14, 
"ansi-mode"]], "API Differences Between Spark Versions": [[0, 
"api-differences-between-spark-versions"]], "ASF Links": [[13, null]], 
"Accelerating Apache Iceberg Parquet Scans using Comet (Experimental)": [[20, 
null]], "Adding Spark-side Tests for the New Expression":  [...]
\ No newline at end of file
+Search.setIndex({"alltitles": {"1. Install Comet": [[11, "install-comet"]], 
"2. Clone Spark and Apply Diff": [[11, "clone-spark-and-apply-diff"]], "3. Run 
Spark SQL Tests": [[11, "run-spark-sql-tests"]], "ANSI Mode": [[14, 
"ansi-mode"]], "API Differences Between Spark Versions": [[0, 
"api-differences-between-spark-versions"]], "ASF Links": [[13, null]], 
"Accelerating Apache Iceberg Parquet Scans using Comet (Experimental)": [[20, 
null]], "Adding Spark-side Tests for the New Expression":  [...]
\ No newline at end of file
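
Not part of the commit above, but a related sketch for the same brew-installed setup: when `$SPARK_HOME/sbin` is missing, `start-worker.sh` is missing as well, so a standalone worker can likewise be launched through `spark-class`, pointed at the master endpoint URI found in the log. The host and port below are placeholders taken from the example output.

```shell
# Sketch, not from the commit: launch a standalone worker via spark-class
# when $SPARK_HOME/sbin (and start-worker.sh) is absent.
# Replace the URI with the one your master actually logged.
export SPARK_MASTER=spark://192.168.4.142:7078
$SPARK_HOME/bin/spark-class org.apache.spark.deploy.worker.Worker $SPARK_MASTER
```

The same `SPARK_MASTER` value is what the guide exports in the next step when submitting the benchmarks.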


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@datafusion.apache.org
For additional commands, e-mail: commits-h...@datafusion.apache.org
