Gengliang Wang created SPARK-26915:
--------------------------------------

             Summary: File source should write without schema inference and validation in DataFrameWriter.save()
                 Key: SPARK-26915
                 URL: https://issues.apache.org/jira/browse/SPARK-26915
             Project: Spark
          Issue Type: Task
          Components: SQL
    Affects Versions: 3.0.0
            Reporter: Gengliang Wang


Spark supports writing to file data sources without fetching and validating the existing table schema.
For example, 
```
spark.range(10).write.orc(path)
val newDF = spark.range(20).map(id => (id.toDouble, id.toString)).toDF("double", "string")
newDF.write.mode("overwrite").orc(path)
```
1. There is no need to get/infer the schema from the table/path.
2. The schema of `newDF` can be different from the original table schema (see the snippet below).
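
For illustration (this snippet is not in the original report), reading the path back after the overwrite confirms that the files now carry the schema of `newDF` rather than the original `id: bigint` column:
```
spark.read.orc(path).printSchema()
// root
//  |-- double: double (nullable = true)
//  |-- string: string (nullable = true)
```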

However, as https://github.com/apache/spark/pull/23606/files#r255319992 shows, this behavior is missing in data source V2: the V2 write path always validates the output query against the table schema. Even after catalog support for DS V2 is implemented, I think it will be hard to support both behaviors with the current API/framework.
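
For context, here is a minimal sketch of what that validation amounts to; the names below are illustrative, not Spark's real internal code:
```
// Illustrative sketch only, not the actual Spark V2 analysis code.
import org.apache.spark.sql.types.StructType

// In the V2 write path, the existing table is resolved first and the
// incoming query schema is checked against it, so an overwrite with a
// different schema (like the example above) is rejected.
def validateV2Write(tableSchema: StructType, querySchema: StructType): Unit = {
  // The real check is more lenient (nullability, case sensitivity, ...),
  // but the essence is that the query must conform to the table schema.
  if (querySchema != tableSchema) {
    throw new IllegalArgumentException(
      s"Cannot write incompatible data: query schema $querySchema " +
        s"does not match table schema $tableSchema")
  }
}
```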

This ticket proposes to handle file sources as a special case in `DataFrameWriter.save()`, so that the original behavior of this DataFrame API is preserved.
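
A minimal sketch of the proposed dispatch, using hypothetical helper names (`writeViaV1FileSource` and `writeViaV2WithValidation` are placeholders, not real Spark methods):
```
// Illustrative sketch only; the real change lives inside DataFrameWriter.save().
import org.apache.spark.sql.DataFrame

// Placeholder for the existing V1 file write: no schema inference and no
// validation against whatever files already sit at the path.
def writeViaV1FileSource(df: DataFrame, source: String, path: String): Unit =
  df.write.format(source).mode("overwrite").save(path)

// Placeholder for the V2 write path, which resolves the table and validates
// the query schema (see the validation sketch above).
def writeViaV2WithValidation(df: DataFrame, source: String, path: String): Unit = ???

// Proposed behavior of DataFrameWriter.save(): treat file sources as a
// special case so the original, non-validating behavior is kept.
def save(df: DataFrame, source: String, path: String): Unit = {
  val fileSources = Set("parquet", "orc", "json", "csv", "text")
  if (fileSources.contains(source.toLowerCase)) writeViaV1FileSource(df, source, path)
  else writeViaV2WithValidation(df, source, path)
}
```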


