cloud-fan commented on a change in pull request #25085: [SPARK-28313][SQL]
Spark sql null type incompatible with hive void type
URL: https://github.com/apache/spark/pull/25085#discussion_r324999861
##########
File path:
sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala
##########
@@ -961,7 +961,11 @@ private[hive] object HiveClientImpl {
/** Get the Spark SQL native DataType from Hive's FieldSchema. */
private def getSparkSQLDataType(hc: FieldSchema): DataType = {
try {
- CatalystSqlParser.parseDataType(hc.getType)
+ hc.getType match {
+ // SPARK-28313: map Hive's void type to Spark's NullType
+ case "void" => NullType
Review comment:
I believe this is the fix we need, but I'm curious when this can
happen, given that Hive forbids explicitly defining void-type columns.
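
For context on the question above: although Hive DDL rejects an explicit
void column type, a column can still end up typed as void when it is
created from a literal NULL, e.g. via CREATE TABLE t AS SELECT NULL AS c.
The following is a minimal sketch, not the PR's verbatim code, of how the
branch reads once the truncated default case is restored; the `case other`
line and the wrapper object VoidTypeSketch are assumptions for illustration.

    import org.apache.hadoop.hive.metastore.api.FieldSchema
    import org.apache.spark.sql.catalyst.parser.CatalystSqlParser
    import org.apache.spark.sql.types.{DataType, NullType}

    // Hypothetical wrapper object; the real method lives in HiveClientImpl.
    object VoidTypeSketch {
      def getSparkSQLDataType(hc: FieldSchema): DataType = hc.getType match {
        // SPARK-28313: the metastore reports NULL-literal columns with the
        // type name "void", which CatalystSqlParser does not recognize,
        // so map it directly to Spark's NullType.
        case "void" => NullType
        // Assumed default branch, reconstructed from the removed diff line.
        case other => CatalystSqlParser.parseDataType(other)
      }
    }

With this in place, a FieldSchema whose type string is "void" resolves to
NullType instead of failing in the parser.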