This is an automated email from the ASF dual-hosted git repository.
github-bot pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/datafusion-comet.git
The following commit(s) were added to refs/heads/asf-site by this push:
new 45ce9b7d7 Publish built docs triggered by
de0be4b4c41a4e2d234c8f50d89d172c60064e6b
45ce9b7d7 is described below
commit 45ce9b7d7acb2193917e0aca01647aaf2f3890be
Author: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
AuthorDate: Tue Feb 18 01:25:57 2025 +0000
Publish built docs triggered by de0be4b4c41a4e2d234c8f50d89d172c60064e6b
---
_sources/user-guide/datasources.md.txt | 78 ++++++++++++++
contributor-guide/adding_a_new_expression.html | 2 +-
contributor-guide/benchmark-results/tpc-ds.html | 2 +-
contributor-guide/benchmark-results/tpc-h.html | 2 +-
contributor-guide/benchmarking.html | 2 +-
contributor-guide/contributing.html | 2 +-
contributor-guide/debugging.html | 2 +-
contributor-guide/development.html | 2 +-
contributor-guide/plugin_overview.html | 2 +-
contributor-guide/profiling_native_code.html | 2 +-
contributor-guide/spark-sql-tests.html | 2 +-
genindex.html | 2 +-
index.html | 5 +-
search.html | 2 +-
searchindex.js | 2 +-
user-guide/compatibility.html | 2 +-
user-guide/configs.html | 2 +-
user-guide/datasources.html | 131 +++++++++++++++++++++---
user-guide/datatypes.html | 2 +-
user-guide/expressions.html | 2 +-
user-guide/installation.html | 2 +-
user-guide/kubernetes.html | 2 +-
user-guide/metrics.html | 2 +-
user-guide/operators.html | 2 +-
user-guide/overview.html | 2 +-
user-guide/source.html | 2 +-
user-guide/tuning.html | 2 +-
27 files changed, 224 insertions(+), 38 deletions(-)
diff --git a/_sources/user-guide/datasources.md.txt
b/_sources/user-guide/datasources.md.txt
index 9607ba603..27c5492d8 100644
--- a/_sources/user-guide/datasources.md.txt
+++ b/_sources/user-guide/datasources.md.txt
@@ -35,3 +35,81 @@ converted into Arrow format, allowing native execution to
happen after that.
Comet does not provide native JSON scan, but when
`spark.comet.convert.json.enabled` is enabled, data is immediately
converted into Arrow format, allowing native execution to happen after that.
+
+# Supported Storages
+
+## Local
+In progress
+
+## HDFS
+
+The Apache DataFusion Comet native reader can scan files from remote HDFS
+for [supported formats](#supported-spark-data-sources).
+
+### Using experimental native DataFusion reader
+Unlike the native Comet reader, the DataFusion reader fully supports nested
+type processing. This reader is currently experimental.
+
+To build Comet with the native DataFusion reader and remote HDFS support, a
+JDK must be installed.
+
+Example: to build Comet for `spark-3.4`, provide the JDK path in `JAVA_HOME`
+and the JRE linker path in `RUSTFLAGS`. The linker path can vary depending on
+the system; typically, the JRE linker is part of the installed JDK.
+
+```shell
+export JAVA_HOME="/opt/homebrew/opt/openjdk@11"
+make release PROFILES="-Pspark-3.4" COMET_FEATURES=hdfs RUSTFLAGS="-L
$JAVA_HOME/libexec/openjdk.jdk/Contents/Home/lib/server"
+```
+
+Start Comet with the experimental reader and HDFS support as
+[described](installation.md/#run-spark-shell-with-comet-enabled), adding the
+following additional parameters:
+
+```shell
+--conf spark.comet.scan.impl=native_datafusion \
+--conf spark.hadoop.fs.defaultFS="hdfs://namenode:9000" \
+--conf spark.hadoop.dfs.client.use.datanode.hostname=true \
+--conf dfs.client.use.datanode.hostname=true
+```
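+
+For illustration only, these flags might be combined into a basic
+Comet-enabled launch along these lines (a sketch, not the authoritative
+command: the Comet jar path and version below are placeholders, so follow the
+installation guide for the exact invocation on your build):
+
+```shell
+# Hypothetical paths: adjust the Comet jar for your Spark/Scala/Comet versions
+export COMET_JAR=spark/target/comet-spark-spark3.4_2.12-0.6.0-SNAPSHOT.jar
+$SPARK_HOME/bin/spark-shell \
+    --jars $COMET_JAR \
+    --conf spark.plugins=org.apache.spark.CometPlugin \
+    --conf spark.comet.enabled=true \
+    --conf spark.comet.scan.impl=native_datafusion \
+    --conf spark.hadoop.fs.defaultFS="hdfs://namenode:9000" \
+    --conf spark.hadoop.dfs.client.use.datanode.hostname=true
+```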
+
+Query a struct type from remote HDFS:
+```shell
+spark.read.parquet("hdfs://namenode:9000/user/data").show(false)
+
+root
+ |-- id: integer (nullable = true)
+ |-- first_name: string (nullable = true)
+ |-- personal_info: struct (nullable = true)
+ | |-- firstName: string (nullable = true)
+ | |-- lastName: string (nullable = true)
+ | |-- ageInYears: integer (nullable = true)
+
+25/01/30 16:50:43 INFO core/src/lib.rs: Comet native library version 0.6.0
initialized
+== Physical Plan ==
+* CometColumnarToRow (2)
++- CometNativeScan: (1)
+
+
+(1) CometNativeScan:
+Output [3]: [id#0, first_name#1, personal_info#4]
+Arguments: [id#0, first_name#1, personal_info#4]
+
+(2) CometColumnarToRow [codegen id : 1]
+Input [3]: [id#0, first_name#1, personal_info#4]
+
+
+25/01/30 16:50:44 INFO fs-hdfs-0.1.12/src/hdfs.rs: Connecting to Namenode
(hdfs://namenode:9000)
++---+----------+-----------------+
+|id |first_name|personal_info |
++---+----------+-----------------+
+|2 |Jane |{Jane, Smith, 34}|
+|1 |John |{John, Doe, 28} |
++---+----------+-----------------+
+
+
+
+```
+
+Verify that the scan uses `CometNativeScan`.
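+
+One way to check this (a sketch, reusing the example path from above) is to
+print the physical plan from the Spark shell:
+
+```scala
+// The plan should contain CometNativeScan rather than a regular parquet scan
+spark.read.parquet("hdfs://namenode:9000/user/data").explain()
+```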
+
+More on [HDFS Reader](../../../native/hdfs/README.md)
+
+## S3
+In progress
diff --git a/contributor-guide/adding_a_new_expression.html
b/contributor-guide/adding_a_new_expression.html
index 23f88d096..b34090c27 100644
--- a/contributor-guide/adding_a_new_expression.html
+++ b/contributor-guide/adding_a_new_expression.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="../user-guide/datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/contributor-guide/benchmark-results/tpc-ds.html
b/contributor-guide/benchmark-results/tpc-ds.html
index 5af208c08..bdd7c8e34 100644
--- a/contributor-guide/benchmark-results/tpc-ds.html
+++ b/contributor-guide/benchmark-results/tpc-ds.html
@@ -128,7 +128,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="../../user-guide/datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/contributor-guide/benchmark-results/tpc-h.html
b/contributor-guide/benchmark-results/tpc-h.html
index 1660f91e3..f6e53d9ed 100644
--- a/contributor-guide/benchmark-results/tpc-h.html
+++ b/contributor-guide/benchmark-results/tpc-h.html
@@ -128,7 +128,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="../../user-guide/datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/contributor-guide/benchmarking.html
b/contributor-guide/benchmarking.html
index 7807a9049..f5d907631 100644
--- a/contributor-guide/benchmarking.html
+++ b/contributor-guide/benchmarking.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="../user-guide/datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/contributor-guide/contributing.html
b/contributor-guide/contributing.html
index d4848d73e..5f3ac30b1 100644
--- a/contributor-guide/contributing.html
+++ b/contributor-guide/contributing.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="../user-guide/datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/contributor-guide/debugging.html b/contributor-guide/debugging.html
index cc26aa3e1..09f77950f 100644
--- a/contributor-guide/debugging.html
+++ b/contributor-guide/debugging.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="../user-guide/datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/contributor-guide/development.html
b/contributor-guide/development.html
index 13eb09026..c0b1d44c1 100644
--- a/contributor-guide/development.html
+++ b/contributor-guide/development.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="../user-guide/datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/contributor-guide/plugin_overview.html
b/contributor-guide/plugin_overview.html
index 33ef28173..4218d0b35 100644
--- a/contributor-guide/plugin_overview.html
+++ b/contributor-guide/plugin_overview.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="../user-guide/datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/contributor-guide/profiling_native_code.html
b/contributor-guide/profiling_native_code.html
index 014fb9c95..a2683a7f4 100644
--- a/contributor-guide/profiling_native_code.html
+++ b/contributor-guide/profiling_native_code.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="../user-guide/datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/contributor-guide/spark-sql-tests.html
b/contributor-guide/spark-sql-tests.html
index b66325493..d1ade3f93 100644
--- a/contributor-guide/spark-sql-tests.html
+++ b/contributor-guide/spark-sql-tests.html
@@ -129,7 +129,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="../user-guide/datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/genindex.html b/genindex.html
index 4d85af0cb..1827062ab 100644
--- a/genindex.html
+++ b/genindex.html
@@ -127,7 +127,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="user-guide/datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/index.html b/index.html
index 22616d4f8..ce8d9dc04 100644
--- a/index.html
+++ b/index.html
@@ -129,7 +129,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="user-guide/datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
@@ -325,7 +325,8 @@ as a native runtime to achieve improvement in terms of
query efficiency and quer
<li class="toctree-l1"><a class="reference internal"
href="user-guide/installation.html">Installing Comet</a></li>
<li class="toctree-l1"><a class="reference internal"
href="user-guide/source.html">Building From Source</a></li>
<li class="toctree-l1"><a class="reference internal"
href="user-guide/kubernetes.html">Kubernetes Guide</a></li>
-<li class="toctree-l1"><a class="reference internal"
href="user-guide/datasources.html">Supported Data Sources</a></li>
+<li class="toctree-l1"><a class="reference internal"
href="user-guide/datasources.html">Supported Spark Data Sources</a></li>
+<li class="toctree-l1"><a class="reference internal"
href="user-guide/datasources.html#supported-storages">Supported
Storages</a></li>
<li class="toctree-l1"><a class="reference internal"
href="user-guide/datatypes.html">Supported Data Types</a></li>
<li class="toctree-l1"><a class="reference internal"
href="user-guide/operators.html">Supported Operators</a></li>
<li class="toctree-l1"><a class="reference internal"
href="user-guide/expressions.html">Supported Expressions</a></li>
diff --git a/search.html b/search.html
index fb7a5b03e..696b9a643 100644
--- a/search.html
+++ b/search.html
@@ -134,7 +134,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="user-guide/datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/searchindex.js b/searchindex.js
index 4ed4b065f..e6c1cc42f 100644
--- a/searchindex.js
+++ b/searchindex.js
@@ -1 +1 @@
-Search.setIndex({"alltitles": {"1. Install Comet": [[9, "install-comet"]], "2.
Clone Spark and Apply Diff": [[9, "clone-spark-and-apply-diff"]], "3. Run Spark
SQL Tests": [[9, "run-spark-sql-tests"]], "ANSI mode": [[11, "ansi-mode"]],
"API Differences Between Spark Versions": [[0,
"api-differences-between-spark-versions"]], "ASF Links": [[10, null]], "Adding
Spark-side Tests for the New Expression": [[0,
"adding-spark-side-tests-for-the-new-expression"]], "Adding a New Expression":
[[0, [...]
\ No newline at end of file
+Search.setIndex({"alltitles": {"1. Install Comet": [[9, "install-comet"]], "2.
Clone Spark and Apply Diff": [[9, "clone-spark-and-apply-diff"]], "3. Run Spark
SQL Tests": [[9, "run-spark-sql-tests"]], "ANSI mode": [[11, "ansi-mode"]],
"API Differences Between Spark Versions": [[0,
"api-differences-between-spark-versions"]], "ASF Links": [[10, null]], "Adding
Spark-side Tests for the New Expression": [[0,
"adding-spark-side-tests-for-the-new-expression"]], "Adding a New Expression":
[[0, [...]
\ No newline at end of file
diff --git a/user-guide/compatibility.html b/user-guide/compatibility.html
index bdf5e96f7..2efc441ec 100644
--- a/user-guide/compatibility.html
+++ b/user-guide/compatibility.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/user-guide/configs.html b/user-guide/configs.html
index 7ee4d7db6..27923beb3 100644
--- a/user-guide/configs.html
+++ b/user-guide/configs.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/user-guide/datasources.html b/user-guide/datasources.html
index 1ecb03817..fbdfc089a 100644
--- a/user-guide/datasources.html
+++ b/user-guide/datasources.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1 current active">
<a class="current reference internal" href="#">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
@@ -286,20 +286,56 @@ under the License.
<nav id="bd-toc-nav">
<ul class="visible nav section-nav flex-column">
- <li class="toc-h2 nav-item toc-entry">
- <a class="reference internal nav-link" href="#parquet">
- Parquet
- </a>
- </li>
- <li class="toc-h2 nav-item toc-entry">
- <a class="reference internal nav-link" href="#csv">
- CSV
+ <li class="toc-h1 nav-item toc-entry">
+ <a class="reference internal nav-link" href="#">
+ Supported Spark Data Sources
</a>
+ <ul class="visible nav section-nav flex-column">
+ <li class="toc-h2 nav-item toc-entry">
+ <a class="reference internal nav-link" href="#parquet">
+ Parquet
+ </a>
+ </li>
+ <li class="toc-h2 nav-item toc-entry">
+ <a class="reference internal nav-link" href="#csv">
+ CSV
+ </a>
+ </li>
+ <li class="toc-h2 nav-item toc-entry">
+ <a class="reference internal nav-link" href="#json">
+ JSON
+ </a>
+ </li>
+ </ul>
</li>
- <li class="toc-h2 nav-item toc-entry">
- <a class="reference internal nav-link" href="#json">
- JSON
+ <li class="toc-h1 nav-item toc-entry">
+ <a class="reference internal nav-link" href="#supported-storages">
+ Supported Storages
</a>
+ <ul class="visible nav section-nav flex-column">
+ <li class="toc-h2 nav-item toc-entry">
+ <a class="reference internal nav-link" href="#local">
+ Local
+ </a>
+ </li>
+ <li class="toc-h2 nav-item toc-entry">
+ <a class="reference internal nav-link" href="#hdfs">
+ HDFS
+ </a>
+ <ul class="nav section-nav flex-column">
+ <li class="toc-h3 nav-item toc-entry">
+ <a class="reference internal nav-link"
href="#using-experimental-native-datafusion-reader">
+ Using experimental native DataFusion reader
+ </a>
+ </li>
+ </ul>
+ </li>
+ <li class="toc-h2 nav-item toc-entry">
+ <a class="reference internal nav-link" href="#s3">
+ S3
+ </a>
+ </li>
+ </ul>
</li>
</ul>
@@ -366,6 +402,77 @@ converted into Arrow format, allowing native execution to
happen after that.</p>
<p>Comet does not provide native JSON scan, but when <code class="docutils
literal notranslate"><span
class="pre">spark.comet.convert.json.enabled</span></code> is enabled, data is
immediately
converted into Arrow format, allowing native execution to happen after
that.</p>
</section>
+</section>
+<section id="supported-storages">
+<h1>Supported Storages<a class="headerlink" href="#supported-storages"
title="Link to this heading">¶</a></h1>
+<section id="local">
+<h2>Local<a class="headerlink" href="#local" title="Link to this
heading">¶</a></h2>
+<p>In progress</p>
+</section>
+<section id="hdfs">
+<h2>HDFS<a class="headerlink" href="#hdfs" title="Link to this
heading">¶</a></h2>
+<p>The Apache DataFusion Comet native reader can scan files from remote HDFS for <a class="reference internal" href="#supported-spark-data-sources">supported formats</a>.</p>
+<section id="using-experimental-native-datafusion-reader">
+<h3>Using experimental native DataFusion reader<a class="headerlink"
href="#using-experimental-native-datafusion-reader" title="Link to this
heading">¶</a></h3>
+<p>Unlike the native Comet reader, the DataFusion reader fully supports nested types processing. This reader is currently experimental.</p>
+<p>To build Comet with the native DataFusion reader and remote HDFS support, a JDK must be installed.</p>
+<p>Example: to build Comet for <code class="docutils literal notranslate"><span class="pre">spark-3.4</span></code>, provide the JDK path in <code class="docutils literal notranslate"><span class="pre">JAVA_HOME</span></code> and the JRE linker path in <code class="docutils literal notranslate"><span class="pre">RUSTFLAGS</span></code>. The linker path can vary depending on the system; typically, the JRE linker is part of the installed JDK.</p>
+<div class="highlight-shell notranslate"><div
class="highlight"><pre><span></span><span class="nb">export</span><span
class="w"> </span><span class="nv">JAVA_HOME</span><span
class="o">=</span><span
class="s2">"/opt/homebrew/opt/openjdk@11"</span>
+make<span class="w"> </span>release<span class="w"> </span><span
class="nv">PROFILES</span><span class="o">=</span><span
class="s2">"-Pspark-3.4"</span><span class="w"> </span><span
class="nv">COMET_FEATURES</span><span class="o">=</span>hdfs<span class="w">
</span><span class="nv">RUSTFLAGS</span><span class="o">=</span><span
class="s2">"-L </span><span class="nv">$JAVA_HOME</span><span
class="s2">/libexec/openjdk.jdk/Contents/Home/lib/server"</span>
+</pre></div>
+</div>
+<p>Start Comet with the experimental reader and HDFS support as <a class="reference internal" href="installation.html#run-spark-shell-with-comet-enabled"><span class="std std-ref">described</span></a>, adding the following additional parameters:</p>
+<div class="highlight-shell notranslate"><div
class="highlight"><pre><span></span>--conf<span class="w">
</span>spark.comet.scan.impl<span class="o">=</span>native_datafusion<span
class="w"> </span><span class="se">\</span>
+--conf<span class="w"> </span>spark.hadoop.fs.defaultFS<span
class="o">=</span><span class="s2">"hdfs://namenode:9000"</span><span
class="w"> </span><span class="se">\</span>
+--conf<span class="w"> </span>spark.hadoop.dfs.client.use.datanode.hostname<span class="o">=</span><span class="nb">true</span><span class="w"> </span><span class="se">\</span>
+--conf<span class="w"> </span>dfs.client.use.datanode.hostname<span class="o">=</span><span class="nb">true</span>
+</pre></div>
+</div>
+<p>Query a struct type from remote HDFS:</p>
+<div class="highlight-shell notranslate"><div
class="highlight"><pre><span></span>spark.read.parquet<span
class="o">(</span><span
class="s2">"hdfs://namenode:9000/user/data"</span><span
class="o">)</span>.show<span class="o">(</span><span
class="nb">false</span><span class="o">)</span>
+
+root
+<span class="w"> </span><span class="p">|</span>--<span class="w">
</span>id:<span class="w"> </span>integer<span class="w"> </span><span
class="o">(</span><span class="nv">nullable</span><span class="w"> </span><span
class="o">=</span><span class="w"> </span><span class="nb">true</span><span
class="o">)</span>
+<span class="w"> </span><span class="p">|</span>--<span class="w">
</span>first_name:<span class="w"> </span>string<span class="w"> </span><span
class="o">(</span><span class="nv">nullable</span><span class="w"> </span><span
class="o">=</span><span class="w"> </span><span class="nb">true</span><span
class="o">)</span>
+<span class="w"> </span><span class="p">|</span>--<span class="w">
</span>personal_info:<span class="w"> </span>struct<span class="w">
</span><span class="o">(</span><span class="nv">nullable</span><span class="w">
</span><span class="o">=</span><span class="w"> </span><span
class="nb">true</span><span class="o">)</span>
+<span class="w"> </span><span class="p">|</span><span class="w">
</span><span class="p">|</span>--<span class="w"> </span>firstName:<span
class="w"> </span>string<span class="w"> </span><span class="o">(</span><span
class="nv">nullable</span><span class="w"> </span><span class="o">=</span><span
class="w"> </span><span class="nb">true</span><span class="o">)</span>
+<span class="w"> </span><span class="p">|</span><span class="w">
</span><span class="p">|</span>--<span class="w"> </span>lastName:<span
class="w"> </span>string<span class="w"> </span><span class="o">(</span><span
class="nv">nullable</span><span class="w"> </span><span class="o">=</span><span
class="w"> </span><span class="nb">true</span><span class="o">)</span>
+<span class="w"> </span><span class="p">|</span><span class="w">
</span><span class="p">|</span>--<span class="w"> </span>ageInYears:<span
class="w"> </span>integer<span class="w"> </span><span class="o">(</span><span
class="nv">nullable</span><span class="w"> </span><span class="o">=</span><span
class="w"> </span><span class="nb">true</span><span class="o">)</span>
+
+<span class="m">25</span>/01/30<span class="w"> </span><span
class="m">16</span>:50:43<span class="w"> </span>INFO<span class="w">
</span>core/src/lib.rs:<span class="w"> </span>Comet<span class="w">
</span>native<span class="w"> </span>library<span class="w">
</span>version<span class="w"> </span><span class="m">0</span>.6.0<span
class="w"> </span><span class="nv">initialized</span>
+<span class="o">==</span><span class="w"> </span>Physical<span class="w">
</span><span class="nv">Plan</span><span class="w"> </span><span
class="o">==</span>
+*<span class="w"> </span>CometColumnarToRow<span class="w"> </span><span
class="o">(</span><span class="m">2</span><span class="o">)</span>
++-<span class="w"> </span>CometNativeScan:<span class="w"> </span><span
class="o">(</span><span class="m">1</span><span class="o">)</span>
+
+
+<span class="o">(</span><span class="m">1</span><span class="o">)</span><span
class="w"> </span>CometNativeScan:<span class="w"> </span>
+Output<span class="w"> </span><span class="o">[</span><span
class="m">3</span><span class="o">]</span>:<span class="w"> </span><span
class="o">[</span>id#0,<span class="w"> </span>first_name#1,<span class="w">
</span>personal_info#4<span class="o">]</span>
+Arguments:<span class="w"> </span><span class="o">[</span>id#0,<span
class="w"> </span>first_name#1,<span class="w"> </span>personal_info#4<span
class="o">]</span>
+
+<span class="o">(</span><span class="m">2</span><span class="o">)</span><span
class="w"> </span>CometColumnarToRow<span class="w"> </span><span
class="o">[</span>codegen<span class="w"> </span>id<span class="w">
</span>:<span class="w"> </span><span class="m">1</span><span class="o">]</span>
+Input<span class="w"> </span><span class="o">[</span><span
class="m">3</span><span class="o">]</span>:<span class="w"> </span><span
class="o">[</span>id#0,<span class="w"> </span>first_name#1,<span class="w">
</span>personal_info#4<span class="o">]</span>
+
+
+<span class="m">25</span>/01/30<span class="w"> </span><span
class="m">16</span>:50:44<span class="w"> </span>INFO<span class="w">
</span>fs-hdfs-0.1.12/src/hdfs.rs:<span class="w"> </span>Connecting<span
class="w"> </span>to<span class="w"> </span>Namenode<span class="w">
</span><span class="o">(</span>hdfs://namenode:9000<span class="o">)</span>
++---+----------+-----------------+
+<span class="p">|</span>id<span class="w"> </span><span
class="p">|</span>first_name<span class="p">|</span>personal_info<span
class="w"> </span><span class="p">|</span>
++---+----------+-----------------+
+<span class="p">|</span><span class="m">2</span><span class="w"> </span><span
class="p">|</span>Jane<span class="w"> </span><span
class="p">|</span><span class="o">{</span>Jane,<span class="w">
</span>Smith,<span class="w"> </span><span class="m">34</span><span
class="o">}</span><span class="p">|</span>
+<span class="p">|</span><span class="m">1</span><span class="w"> </span><span
class="p">|</span>John<span class="w"> </span><span
class="p">|</span><span class="o">{</span>John,<span class="w">
</span>Doe,<span class="w"> </span><span class="m">28</span><span
class="o">}</span><span class="w"> </span><span class="p">|</span>
++---+----------+-----------------+
+</pre></div>
+</div>
+<p>Verify that the scan uses <code class="docutils literal notranslate"><span class="pre">CometNativeScan</span></code>.</p>
+<p>More on <span class="xref myst">HDFS Reader</span></p>
+</section>
+</section>
+<section id="s3">
+<h2>S3<a class="headerlink" href="#s3" title="Link to this heading">¶</a></h2>
+<p>In progress</p>
+</section>
</section>
diff --git a/user-guide/datatypes.html b/user-guide/datatypes.html
index bb85bd115..42e3f9ecf 100644
--- a/user-guide/datatypes.html
+++ b/user-guide/datatypes.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1 current active">
diff --git a/user-guide/expressions.html b/user-guide/expressions.html
index d0e87359b..49915ec32 100644
--- a/user-guide/expressions.html
+++ b/user-guide/expressions.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/user-guide/installation.html b/user-guide/installation.html
index a0a201a3d..8aa544115 100644
--- a/user-guide/installation.html
+++ b/user-guide/installation.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/user-guide/kubernetes.html b/user-guide/kubernetes.html
index 1aa83d855..6d22e80fe 100644
--- a/user-guide/kubernetes.html
+++ b/user-guide/kubernetes.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/user-guide/metrics.html b/user-guide/metrics.html
index 532dd3627..8f3b30e17 100644
--- a/user-guide/metrics.html
+++ b/user-guide/metrics.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/user-guide/operators.html b/user-guide/operators.html
index 7b5b60092..ee66c5790 100644
--- a/user-guide/operators.html
+++ b/user-guide/operators.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/user-guide/overview.html b/user-guide/overview.html
index 7b273bf5f..eec0cdf8a 100644
--- a/user-guide/overview.html
+++ b/user-guide/overview.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/user-guide/source.html b/user-guide/source.html
index c5a081c0b..ce7c7fe92 100644
--- a/user-guide/source.html
+++ b/user-guide/source.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
diff --git a/user-guide/tuning.html b/user-guide/tuning.html
index a5dec536c..d69ae235a 100644
--- a/user-guide/tuning.html
+++ b/user-guide/tuning.html
@@ -130,7 +130,7 @@ under the License.
</li>
<li class="toctree-l1">
<a class="reference internal" href="datasources.html">
- Supported Data Sources
+ Supported Spark Data Sources
</a>
</li>
<li class="toctree-l1">
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]