gengliangwang opened a new pull request #23829: [SPARK-26915][SQL] File source should write without schema validation in DataFrameWriter.save()
URL: https://github.com/apache/spark/pull/23829

## What changes were proposed in this pull request?

Spark supports writing to file data sources without fetching or validating against the existing table schema. For example:

```
spark.range(10).write.orc(path)
val newDF = spark.range(20).map(id => (id.toDouble, id.toString)).toDF("double", "string")
newDF.write.mode("overwrite").orc(path)
```

1. There is no need to get/infer the schema from the table/path.
2. The schema of `newDF` can be different from the original table schema.

However, as https://github.com/apache/spark/pull/23606/files#r255319992 shows, this behavior is missing in data source V2: currently, data source V2 always validates the output query against the table schema. Even after catalog support for DS V2 is implemented, it would be hard to support both behaviors with the current API/framework.

This PR proposes to handle file sources as a special case in `DataFrameWriter.save()`, so that the original behavior of this `DataFrame` API is preserved. The PR also re-enables the write path of the ORC data source V2.

## How was this patch tested?

Unit test.
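To make the behavior difference concrete, here is a toy sketch (not Spark's actual code; all names and the schema representation are illustrative assumptions) of the dispatch this PR proposes in `DataFrameWriter.save()`: file sources skip schema validation and simply overwrite, while the generic DS V2 path validates the query schema against the table schema.

```python
# Toy model of the proposed special-casing in DataFrameWriter.save().
# Everything here is an illustrative assumption, not Spark internals.

FILE_SOURCES = {"orc", "parquet", "csv", "json", "text"}

def v2_write(table_schema, query_schema):
    # Generic DS V2 path: validates the output query schema
    # against the existing table schema and rejects mismatches.
    if table_schema is not None and table_schema != query_schema:
        raise ValueError(
            f"schema mismatch: query {query_schema} vs table {table_schema}")
    return query_schema

def file_source_write(table_schema, query_schema):
    # File-source path: no schema inference or validation; the new
    # schema simply replaces whatever was at the path before.
    return query_schema

def save(source, table_schema, query_schema):
    # The special case proposed by the PR: file sources bypass
    # the V2 validation step inside save().
    if source in FILE_SOURCES:
        return file_source_write(table_schema, query_schema)
    return v2_write(table_schema, query_schema)
```

With this dispatch, overwriting an ORC path with a differently-shaped DataFrame succeeds (mirroring the `spark.range` example above), while a non-file V2 source would still reject the mismatched schema.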
