[
https://issues.apache.org/jira/browse/SPARK-36680?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
wangzhun updated SPARK-36680:
-----------------------------
Description:
Currently, a DataFrame API user can set options dynamically through the
_DataFrameReader$option_ method, but Spark SQL users have no equivalent. In the
DataFrame path, per-query options flow through the following chain:
{code:java}
DataFrameReader/AstBuilder -> UnresolvedRelation$options ->
DataSourceV2Relation$options -> SupportsRead$newScanBuilder(options)
{code}
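As a minimal sketch of where these options end up on the connector side (the
class and option names below are illustrative, not taken from Spark's own
sources), a DataSource V2 table receives them in _SupportsRead$newScanBuilder_:
{code:java}
import java.util.Collections

import org.apache.spark.sql.connector.catalog.{SupportsRead, Table, TableCapability}
import org.apache.spark.sql.connector.read.{Scan, ScanBuilder}
import org.apache.spark.sql.types.StructType
import org.apache.spark.sql.util.CaseInsensitiveStringMap

// Illustrative sketch of a minimal DSv2 table: newScanBuilder is the hook that
// receives per-query options. Today only the DataFrame API path can populate them.
class ExampleTable(tableSchema: StructType) extends Table with SupportsRead {
  override def name(): String = "example"
  override def schema(): StructType = tableSchema
  override def capabilities(): java.util.Set[TableCapability] =
    Collections.singleton(TableCapability.BATCH_READ)

  override def newScanBuilder(options: CaseInsensitiveStringMap): ScanBuilder = {
    // A JDBC connector would read e.g. "fetchsize" here, Iceberg "snapshot-id";
    // a real implementation would thread the value into the Scan it builds.
    val fetchSize = options.getInt("fetchsize", 1000)
    new ScanBuilder {
      override def build(): Scan = new Scan {
        override def readSchema(): StructType = tableSchema
      }
    }
  }
}
{code}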
Table options are persisted in the catalog, and modifying them requires a separate
DDL statement such as "_ALTER TABLE ..._". However, there are cases where a user
wants to adjust table options dynamically, just for a single query:
* A JDBC table setting _fetchsize_ according to the actual characteristics of the table (see the JDBC sketch after the Iceberg example below)
* An Iceberg table supporting time travel, for example:
{code:java}
spark.read
.option("snapshot-id", 10963874102873L)
.format("iceberg")
.load("path/to/table"){code}
Setting such parameters is a very common, ad-hoc need. Being able to set them
flexibly in SQL would improve the Spark SQL user experience, especially now that
pluggable catalogs are supported.
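Purely as an illustration of the kind of surface this issue asks for (the
hint-style syntax below is hypothetical, borrowed from Flink's dynamic table
options, and does not exist in Spark SQL today):
{code:java}
// Hypothetical, illustrative syntax only -- NOT current Spark SQL.
// The string is only defined here to show the intent; Spark would not parse it today.
val proposedQuery =
  """SELECT * FROM iceberg_catalog.db.events /*+ OPTIONS('snapshot-id'='10963874102873') */"""
{code}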
> Supports Dynamic Table Options for Spark SQL
> --------------------------------------------
>
> Key: SPARK-36680
> URL: https://issues.apache.org/jira/browse/SPARK-36680
> Project: Spark
> Issue Type: New Feature
> Components: SQL
> Affects Versions: 3.1.2
> Reporter: wangzhun
> Priority: Major
>