GitHub user gengliangwang commented on a diff in the pull request:
https://github.com/apache/spark/pull/21389#discussion_r189872259
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/json/JsonUtils.scala ---
@@ -48,4 +49,33 @@ object JsonUtils {
json.sample(withReplacement = false, options.samplingRatio, 1)
}
}
+
+ /**
+ * Verify if the schema is supported in JSON datasource.
+ */
+ def verifySchema(schema: StructType): Unit = {
--- End diff ---
The function `verifySchema` is very similar to the ones in the Orc/Parquet datasources,
except for the exception message. Should we move it into a shared util object?
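
For illustration, a rough sketch of what such a shared helper could look like. The `DataSourceUtils` object name, the `format` string parameter, and the specific rejected types below are assumptions for the sketch, not the actual implementation:

```scala
import org.apache.spark.sql.types._

// Hypothetical shared helper; only the error message depends on the format
// name, which is the part that currently differs between the copies.
object DataSourceUtils {

  /**
   * Verify that every field of `schema` uses a data type the given
   * datasource supports.
   */
  def verifySchema(format: String, schema: StructType): Unit = {
    def verifyType(dataType: DataType): Unit = dataType match {
      // Recurse into nested types.
      case ArrayType(elementType, _) =>
        verifyType(elementType)
      case MapType(keyType, valueType, _) =>
        verifyType(keyType)
        verifyType(valueType)
      case StructType(fields) =>
        fields.foreach(f => verifyType(f.dataType))

      // Example unsupported leaf types; the real list would come from
      // consolidating the existing per-format checks.
      case _: CalendarIntervalType | _: NullType =>
        throw new UnsupportedOperationException(
          s"$format data source does not support ${dataType.catalogString} data type.")

      // All other leaf types are assumed supported in this sketch.
      case _ =>
    }

    schema.foreach(field => verifyType(field.dataType))
  }
}
```

Each datasource would then delegate, e.g. `DataSourceUtils.verifySchema("JSON", schema)`, instead of keeping its own copy.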