Github user dilipbiswal commented on a diff in the pull request:
https://github.com/apache/spark/pull/20579#discussion_r176232893
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSource.scala ---
@@ -719,4 +720,27 @@ object DataSource extends Logging {
}
globPath
}
+
+ /**
+ * Called before writing into a FileFormat based data source to make sure the
+ * supplied schema is not empty.
+ * @param schema
+ */
+ private def hasEmptySchema(schema: StructType): Unit = {
+ def hasEmptySchemaInternal(schema: StructType): Boolean = {
--- End diff ---
@cloud-fan I have gone ahead and changed the top-level function name to
validateSchema, and kept the internal function name as hasEmptySchema.
Hopefully it makes sense now.
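
For reference, a rough sketch of the shape of the change as it would sit in
DataSource.scala (the recursive check over nested StructTypes and the exact
error message below are illustrative only, not the exact code in this PR):

import org.apache.spark.sql.AnalysisException
import org.apache.spark.sql.types.{StructField, StructType}

/**
 * Called before writing into a FileFormat based data source to make sure the
 * supplied schema is not empty and contains no nested empty struct.
 */
private def validateSchema(schema: StructType): Unit = {
  // Returns true if the schema itself, or any StructType nested inside it,
  // has no fields.
  def hasEmptySchema(schema: StructType): Boolean = {
    schema.isEmpty || schema.exists {
      case StructField(_, s: StructType, _, _) => hasEmptySchema(s)
      case _ => false
    }
  }

  if (hasEmptySchema(schema)) {
    throw new AnalysisException(
      "Datasource does not support writing empty or nested empty schemas. " +
        "Please make sure the data schema has at least one or more column(s).")
  }
}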
---