This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/datafusion.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new dc732de0c0 Publish built docs triggered by ec222096b9d750614a2e5a27950129b172ea11f4
dc732de0c0 is described below

commit dc732de0c0a0e5e0d8593bf8acf483baf569467b
Author: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
AuthorDate: Tue Mar 4 11:38:49 2025 +0000

    Publish built docs triggered by ec222096b9d750614a2e5a27950129b172ea11f4
---
 _sources/contributor-guide/howtos.md.txt           | 4 ++--
 _sources/library-user-guide/query-optimizer.md.txt | 4 ++--
 _sources/user-guide/configs.md.txt                 | 2 +-
 contributor-guide/howtos.html                      | 4 ++--
 user-guide/configs.html                            | 2 +-
 5 files changed, 8 insertions(+), 8 deletions(-)

diff --git a/_sources/contributor-guide/howtos.md.txt b/_sources/contributor-guide/howtos.md.txt
index 556242751f..89a1bc7360 100644
--- a/_sources/contributor-guide/howtos.md.txt
+++ b/_sources/contributor-guide/howtos.md.txt
@@ -141,9 +141,9 @@ taplo fmt
 
 ## How to update protobuf/gen dependencies
 
-The prost/tonic code can be generated by running `./regen.sh`, which in turn invokes the Rust binary located in [gen](./gen)
+The prost/tonic code can be generated by running `./regen.sh`, which in turn invokes the Rust binary located in `./gen`
 
-This is necessary after modifying the protobuf definitions or altering the dependencies of [gen](./gen), and requires a
+This is necessary after modifying the protobuf definitions or altering the dependencies of `./gen`, and requires a
 valid installation of [protoc] (see [installation instructions] for details).
 
 ```bash
diff --git a/_sources/library-user-guide/query-optimizer.md.txt b/_sources/library-user-guide/query-optimizer.md.txt
index 5883fae7b8..ed6f0dca80 100644
--- a/_sources/library-user-guide/query-optimizer.md.txt
+++ b/_sources/library-user-guide/query-optimizer.md.txt
@@ -359,7 +359,7 @@ interval arithmetic to take an expression such as `a > 2500 AND a <= 5000` and
 build an accurate selectivity estimate that can then be used to find more efficient
 plans.
 
-#### `AnalysisContext` API
+### `AnalysisContext` API
 
 The `AnalysisContext` serves as a shared knowledge base during expression evaluation
 and boundary analysis. Think of it as a dynamic repository that maintains information about:
@@ -372,7 +372,7 @@ What makes `AnalysisContext` particularly powerful is its ability to propagate i
 through the expression tree. As each node in the expression tree is analyzed, it can both
 read from and write to this shared context, allowing for sophisticated boundary analysis and inference.
 
-#### `ColumnStatistics` for Cardinality Estimation
+### `ColumnStatistics` for Cardinality Estimation
 
 Column statistics form the foundation of optimization decisions. Rather than just tracking
 simple metrics, DataFusion's `ColumnStatistics` provides a rich set of information including:
diff --git a/_sources/user-guide/configs.md.txt b/_sources/user-guide/configs.md.txt
index 8c4aad5107..f29fbb6745 100644
--- a/_sources/user-guide/configs.md.txt
+++ b/_sources/user-guide/configs.md.txt
@@ -127,5 +127,5 @@ Environment variables are read during `SessionConfig` initialisation so they mus
 | datafusion.sql_parser.enable_options_value_normalization                | false                     | When set to true, SQL parser will normalize options value (convert value to lowercase). Note that this option is ignored and will be removed in the future. All case-insensitive values are normalized automatically. [...]
 | datafusion.sql_parser.dialect                                           | generic                   | Configure the SQL dialect used by DataFusion's parser; supported values include: Generic, MySQL, PostgreSQL, Hive, SQLite, Snowflake, Redshift, MsSQL, ClickHouse, BigQuery, Ansi, DuckDB and Databricks. [...]
 | datafusion.sql_parser.support_varchar_with_length                       | true                      | If true, permit lengths for `VARCHAR` such as `VARCHAR(20)`, but ignore the length. If false, error if a `VARCHAR` with a length is specified. The Arrow type system does not have a notion of maximum string length and thus DataFusion can not enforce such limits. [...]
-| datafusion.sql_parser.collect_spans                                     | false                     | When set to true, the source locations relative to the original SQL query (i.e. [`Span`](sqlparser::tokenizer::Span)) will be collected and recorded in the logical plan nodes. [...]
+| datafusion.sql_parser.collect_spans                                     | false                     | When set to true, the source locations relative to the original SQL query (i.e. [`Span`](https://docs.rs/sqlparser/latest/sqlparser/tokenizer/struct.Span.html)) will be collected and recorded in the logical plan nodes. [...]
 | datafusion.sql_parser.recursion_limit                                   | 50                        | Specifies the recursion depth limit when parsing complex SQL Queries [...]
diff --git a/contributor-guide/howtos.html b/contributor-guide/howtos.html
index ffbd73c662..a160245773 100644
--- a/contributor-guide/howtos.html
+++ b/contributor-guide/howtos.html
@@ -752,8 +752,8 @@ taplo<span class="w"> </span><span class="m">0</span>.9.0
 </section>
 <section id="how-to-update-protobuf-gen-dependencies">
 <h2>How to update protobuf/gen dependencies<a class="headerlink" href="#how-to-update-protobuf-gen-dependencies" title="Link to this heading">¶</a></h2>
-<p>The prost/tonic code can be generated by running <code class="docutils literal notranslate"><span class="pre">./regen.sh</span></code>, which in turn invokes the Rust binary located in <a class="reference internal" href="#./gen"><span class="xref myst">gen</span></a></p>
-<p>This is necessary after modifying the protobuf definitions or altering the dependencies of <a class="reference internal" href="#./gen"><span class="xref myst">gen</span></a>, and requires a
+<p>The prost/tonic code can be generated by running <code class="docutils literal notranslate"><span class="pre">./regen.sh</span></code>, which in turn invokes the Rust binary located in <code class="docutils literal notranslate"><span class="pre">./gen</span></code></p>
+<p>This is necessary after modifying the protobuf definitions or altering the dependencies of <code class="docutils literal notranslate"><span class="pre">./gen</span></code>, and requires a
 valid installation of <a class="reference external" href="https://github.com/protocolbuffers/protobuf#protocol-compiler-installation">protoc</a>
 (see <a class="reference external" href="https://datafusion.apache.org/contributor-guide/getting_started.html#protoc-installation">installation
 instructions</a> for details).</p>
 <div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>./regen.sh
 </pre></div>
diff --git a/user-guide/configs.html b/user-guide/configs.html
index a33c2c25bc..c55916fdf0 100644
--- a/user-guide/configs.html
+++ b/user-guide/configs.html
@@ -948,7 +948,7 @@ Environment variables are read during <code class="docutils literal notranslate"
 </tr>
 <tr class="row-even"><td><p>datafusion.sql_parser.collect_spans</p></td>
 <td><p>false</p></td>
-<td><p>When set to true, the source locations relative to the original SQL query (i.e. <a class="reference internal" href="#sqlparser::tokenizer::Span"><span class="xref myst"><code class="docutils literal notranslate"><span class="pre">Span</span></code></span></a>) will be collected and recorded in the logical plan nodes.</p></td>
+<td><p>When set to true, the source locations relative to the original SQL query (i.e. <a class="reference external" href="https://docs.rs/sqlparser/latest/sqlparser/tokenizer/struct.Span.html"><code class="docutils literal notranslate"><span class="pre">Span</span></code></a>) will be collected and recorded in the logical plan nodes.</p></td>
 </tr>
 <tr class="row-odd"><td><p>datafusion.sql_parser.recursion_limit</p></td>
 <td><p>50</p></td>
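
For reference, the `datafusion.sql_parser.collect_spans` option documented by this diff can be toggled per session with a `SET` statement (e.g. in `datafusion-cli`). A minimal sketch; the statements below are generic DataFusion `SET` usage, not part of this commit:

```sql
-- Enable span collection for subsequent statements (option shown in the
-- configs table above; default is false)
SET datafusion.sql_parser.collect_spans = true;
-- Source locations for this query's tokens are now recorded in its
-- logical plan nodes
SELECT 1;
```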


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
