Github user dilipbiswal commented on a diff in the pull request:
https://github.com/apache/spark/pull/20579#discussion_r175019613
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSource.scala
---
@@ -542,6 +542,11 @@ case class DataSource(
throw new AnalysisException("Cannot save interval data type into
external storage.")
}
+ if (data.schema.size == 0) {
--- End diff ---
@gatorsmile May I request you to quickly go through Wenchen's and Ryan's
comments above? My understanding is that we want to consistently reject
writing an empty schema across all the data sources. Please let me know.
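For reference, a minimal sketch of the kind of check being discussed, i.e.
reject a write up front when the schema (or any nested struct in it) has no
columns, so every data source fails consistently. This is not the exact patch
under review; the `EmptySchemaCheck` object, its method names, and the error
message wording are illustrative assumptions, not Spark API:

```scala
import org.apache.spark.sql.AnalysisException
import org.apache.spark.sql.types.{StructField, StructType}

// Hypothetical helper, for illustration only.
object EmptySchemaCheck {
  // True if the schema, or any nested struct inside it, has no fields.
  private def hasEmptySchema(schema: StructType): Boolean =
    schema.isEmpty || schema.fields.exists {
      case StructField(_, s: StructType, _, _) => hasEmptySchema(s)
      case _ => false
    }

  // Fail the write up front instead of letting each source behave differently.
  def verifyWriteSchema(schema: StructType): Unit = {
    if (hasEmptySchema(schema)) {
      throw new AnalysisException(
        "Datasource does not support writing empty or nested empty schemas. " +
          "Please make sure the data schema has at least one or more column(s).")
    }
  }
}
```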
---