[ 
https://issues.apache.org/jira/browse/SPARK-30098?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Wenchen Fan updated SPARK-30098:
--------------------------------
    Description: 
Changing the default provider from `hive` to the value of 
`spark.sql.sources.default` for the "CREATE TABLE" command, to make it consistent 
with the DataFrameWriter.saveAsTable API, w.r.t. the new config. (By default we 
don't change the table provider.)

Also, it is more friendly to end users, since Spark is well known for using 
parquet (the default value of `spark.sql.sources.default`) as its default I/O format.
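As a sketch of the behavior change: the config name used below is an assumption (it may differ across Spark versions), but the provider fallback it illustrates is what the description above proposes.

```sql
-- Hypothetical config name; the exact flag may differ by Spark version.
-- Legacy behavior: CREATE TABLE without a USING clause creates a Hive table.
SET spark.sql.legacy.createHiveTableByDefault=true;
CREATE TABLE t1 (id INT);            -- provider: hive

-- New behavior: the provider falls back to spark.sql.sources.default
-- (parquet by default), matching DataFrameWriter.saveAsTable.
SET spark.sql.legacy.createHiveTableByDefault=false;
CREATE TABLE t2 (id INT);            -- provider: parquet

-- An explicit USING clause is unaffected either way.
CREATE TABLE t3 (id INT) USING orc;  -- provider: orc
```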

  was:
Changing the default provider from `hive` to the value of 
`spark.sql.sources.default` for the "CREATE TABLE" command, to make it consistent 
with the DataFrameWriter.saveAsTable API.

Also, it is more friendly to end users, since Spark is well known for using 
parquet (the default value of `spark.sql.sources.default`) as its default I/O format.


> Add a configuration to use default datasource as provider for CREATE TABLE 
> command
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-30098
>                 URL: https://issues.apache.org/jira/browse/SPARK-30098
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: wuyi
>            Assignee: Wenchen Fan
>            Priority: Major
>             Fix For: 3.1.0
>
>
> Changing the default provider from `hive` to the value of 
> `spark.sql.sources.default` for the "CREATE TABLE" command, to make it 
> consistent with the DataFrameWriter.saveAsTable API, w.r.t. the new config. (By 
> default we don't change the table provider.)
> Also, it is more friendly to end users, since Spark is well known for using 
> parquet (the default value of `spark.sql.sources.default`) as its default I/O 
> format.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
