Github user gatorsmile commented on a diff in the pull request:
https://github.com/apache/spark/pull/19747#discussion_r152734937
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala ---
@@ -895,6 +898,19 @@ private[hive] object HiveClientImpl {
Option(hc.getComment).map(field.withComment).getOrElse(field)
}
+ private def verifyColumnDataType(schema: StructType): Unit = {
+ schema.foreach(field => {
+ val typeString = field.dataType.catalogString
--- End diff ---
`catalogString` is generated by Spark; it is not related to the restriction imposed by Hive.
See my fix:
https://github.com/gatorsmile/spark/commit/bdcb9c8d29db022d9703eb91ef3f74c35bc24ec1
After applying my fix, you will also need to update the test cases so that the exception types are consistent.
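
To make the point concrete, here is a rough, self-contained sketch of the direction I mean: let Hive's own parser (`TypeInfoUtils`) decide whether a type string is acceptable, rather than encoding the restriction around the Spark-generated `catalogString` ourselves. The `ColumnTypeCheckSketch` wrapper and the `SparkException` below are placeholders for illustration only, not necessarily what the linked commit does:

```scala
import scala.util.control.NonFatal

import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils

import org.apache.spark.SparkException
import org.apache.spark.sql.types.StructType

// Standalone wrapper only so the sketch compiles on its own; in the PR this
// logic would live in the HiveClientImpl companion object.
object ColumnTypeCheckSketch {

  // Ask Hive's own type parser whether each column type can be stored in the
  // metastore, instead of hard-coding assumptions about the Spark-generated
  // `catalogString`.
  def verifyColumnDataType(schema: StructType): Unit = {
    schema.foreach { field =>
      val typeString = field.dataType.catalogString
      try {
        // Hive rejects type strings its metastore cannot represent.
        TypeInfoUtils.getTypeInfoFromTypeString(typeString)
      } catch {
        case NonFatal(e) =>
          // The exception type here is only illustrative; whatever is thrown
          // has to match what the updated test cases expect.
          throw new SparkException(
            s"Cannot recognize the data type '$typeString' of column '${field.name}'", e)
      }
    }
  }
}
```
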
---