This is an automated email from the ASF dual-hosted git repository.
github-bot pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/datafusion-comet.git
The following commit(s) were added to refs/heads/asf-site by this push:
new 38e58623c Publish built docs triggered by e17bce3437825446b8b33235437699deb7f2af36
38e58623c is described below
commit 38e58623c5592ef2efc7844a3dd890841652f9e5
Author: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
AuthorDate: Mon Sep 8 22:19:08 2025 +0000
Publish built docs triggered by e17bce3437825446b8b33235437699deb7f2af36
---
_sources/user-guide/latest/datasources.md.txt | 8 ++++++++
searchindex.js | 2 +-
user-guide/latest/datasources.html | 13 +++++++++++++
3 files changed, 22 insertions(+), 1 deletion(-)
diff --git a/_sources/user-guide/latest/datasources.md.txt b/_sources/user-guide/latest/datasources.md.txt
index 0dbc86827..3a469ba30 100644
--- a/_sources/user-guide/latest/datasources.md.txt
+++ b/_sources/user-guide/latest/datasources.md.txt
@@ -175,6 +175,14 @@ The `native_datafusion` and `native_iceberg_compat` Parquet scan implementations
This implementation maintains compatibility with existing Hadoop S3A configurations, so existing code will continue to
work as long as the configurations are supported and can be translated without loss of functionality.
+#### Root CA Certificates
+
+One major difference between `native_comet` and the other scan implementations is the mechanism for discovering Root
+CA Certificates. The `native_comet` scan uses the JVM to read CA Certificates from the Java Trust Store, but the native
+scan implementations `native_datafusion` and `native_iceberg_compat` use system Root CA Certificates (typically stored
+in `/etc/ssl/certs` on Linux). These scans will not be able to interact with S3 if the Root CA Certificates are not
+installed.
+
#### Supported Credential Providers
AWS credential providers can be configured using the `fs.s3a.aws.credentials.provider` configuration. The following
table shows the supported credential providers and their configuration options:
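
[Editor's illustration, not part of this commit: the Root CA requirement added above implies an environment prerequisite. A minimal sketch, assuming a Debian/Ubuntu base image where the system trust store is consolidated under `/etc/ssl/certs`:]

```shell
# Install the system Root CA Certificates that the native_datafusion and
# native_iceberg_compat scans read (they do not consult the Java Trust Store).
# Package name assumes a Debian/Ubuntu image; other distributions differ.
apt-get update && apt-get install -y ca-certificates
# Rebuilds the consolidated trust store under /etc/ssl/certs.
update-ca-certificates
```

[Without these certificates the native scans cannot complete TLS handshakes against S3, while `native_comet` continues to work through the Java Trust Store.]
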
diff --git a/searchindex.js b/searchindex.js
index f86cc0b3c..921dede3b 100644
--- a/searchindex.js
+++ b/searchindex.js
@@ -1 +1 @@
-Search.setIndex({"alltitles": {"1. Install Comet": [[12, "install-comet"]],
"2. Clone Spark and Apply Diff": [[12, "clone-spark-and-apply-diff"]], "3. Run
Spark SQL Tests": [[12, "run-spark-sql-tests"]], "ANSI Mode": [[43,
"ansi-mode"]], "ANSI mode": [[17, "ansi-mode"], [30, "ansi-mode"]], "API
Differences Between Spark Versions": [[0,
"api-differences-between-spark-versions"]], "Accelerating Apache Iceberg
Parquet Scans using Comet (Experimental)": [[22, null], [35, null], [48,
null]], [...]
\ No newline at end of file
+Search.setIndex({"alltitles": {"1. Install Comet": [[12, "install-comet"]],
"2. Clone Spark and Apply Diff": [[12, "clone-spark-and-apply-diff"]], "3. Run
Spark SQL Tests": [[12, "run-spark-sql-tests"]], "ANSI Mode": [[43,
"ansi-mode"]], "ANSI mode": [[17, "ansi-mode"], [30, "ansi-mode"]], "API
Differences Between Spark Versions": [[0,
"api-differences-between-spark-versions"]], "Accelerating Apache Iceberg
Parquet Scans using Comet (Experimental)": [[22, null], [35, null], [48,
null]], [...]
\ No newline at end of file
diff --git a/user-guide/latest/datasources.html b/user-guide/latest/datasources.html
index 59c304567..63dbae98d 100644
--- a/user-guide/latest/datasources.html
+++ b/user-guide/latest/datasources.html
@@ -559,6 +559,11 @@ under the License.
</code>
</a>
<ul class="nav section-nav flex-column">
+ <li class="toc-h4 nav-item toc-entry">
+ <a class="reference internal nav-link" href="#root-ca-certificates">
+ Root CA Certificates
+ </a>
+ </li>
<li class="toc-h4 nav-item toc-entry">
<a class="reference internal nav-link"
href="#supported-credential-providers">
Supported Credential Providers
@@ -778,6 +783,14 @@ Input<span class="w"> </span><span class="o">[</span><span class="m">3</span><sp
<h3><code class="docutils literal notranslate"><span
class="pre">native_datafusion</span></code> and <code class="docutils literal
notranslate"><span class="pre">native_iceberg_compat</span></code><a
class="headerlink" href="#native-datafusion-and-native-iceberg-compat"
title="Link to this heading">¶</a></h3>
<p>The <code class="docutils literal notranslate"><span
class="pre">native_datafusion</span></code> and <code class="docutils literal
notranslate"><span class="pre">native_iceberg_compat</span></code> Parquet scan
implementations completely offload data loading to native code. They use the <a
class="reference external" href="https://crates.io/crates/object_store"><code
class="docutils literal notranslate"><span
class="pre">object_store</span></code> crate</a> to read data from S3 and sup
[...]
<p>This implementation maintains compatibility with existing Hadoop S3A configurations, so existing code will continue to
work as long as the configurations are supported and can be translated without loss of functionality.</p>
+<section id="root-ca-certificates">
+<h4>Root CA Certificates<a class="headerlink" href="#root-ca-certificates" title="Link to this heading">¶</a></h4>
+<p>One major difference between <code class="docutils literal notranslate"><span class="pre">native_comet</span></code> and the other scan implementations is the mechanism for discovering Root
+CA Certificates. The <code class="docutils literal notranslate"><span class="pre">native_comet</span></code> scan uses the JVM to read CA Certificates from the Java Trust Store, but the native
+scan implementations <code class="docutils literal notranslate"><span class="pre">native_datafusion</span></code> and <code class="docutils literal notranslate"><span class="pre">native_iceberg_compat</span></code> use system Root CA Certificates (typically stored
+in <code class="docutils literal notranslate"><span class="pre">/etc/ssl/certs</span></code> on Linux). These scans will not be able to interact with S3 if the Root CA Certificates are not
+installed.</p>
+</section>
<section id="supported-credential-providers">
<h4>Supported Credential Providers<a class="headerlink" href="#supported-credential-providers" title="Link to this heading">¶</a></h4>
<p>AWS credential providers can be configured using the <code class="docutils literal notranslate"><span class="pre">fs.s3a.aws.credentials.provider</span></code> configuration. The
following table shows the supported credential providers and their configuration options:</p>
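
[Editor's illustration, not part of this commit: a hedged sketch of how the `fs.s3a.aws.credentials.provider` setting described above is commonly supplied to a Comet-enabled job. The provider class shown is the stock Hadoop `SimpleAWSCredentialsProvider`, chosen purely for illustration, and `my-app.jar` is a placeholder:]

```shell
# spark.hadoop.* options are copied into the Hadoop configuration,
# which is where the S3A connector resolves its credential provider.
spark-submit \
  --conf spark.hadoop.fs.s3a.aws.credentials.provider=org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider \
  --conf spark.hadoop.fs.s3a.access.key="$AWS_ACCESS_KEY_ID" \
  --conf spark.hadoop.fs.s3a.secret.key="$AWS_SECRET_ACCESS_KEY" \
  my-app.jar
```

[Whether a given provider is supported and translated by the native scans is governed by the table the added documentation refers to.]
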
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]