GitHub user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/15257#discussion_r80663732
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/literals.scala ---
@@ -52,13 +53,51 @@ object Literal {
     case t: Timestamp => Literal(DateTimeUtils.fromJavaTimestamp(t), TimestampType)
     case d: Date => Literal(DateTimeUtils.fromJavaDate(d), DateType)
     case a: Array[Byte] => Literal(a, BinaryType)
+    case a: Array[_] =>
+      val elementType = componentTypeToDataType(a.getClass.getComponentType())
+      val dataType = ArrayType(elementType)
+      val convert = CatalystTypeConverters.createToCatalystConverter(dataType)
+      Literal(convert(a), dataType)
     case i: CalendarInterval => Literal(i, CalendarIntervalType)
     case null => Literal(null, NullType)
     case v: Literal => v
     case _ =>
       throw new RuntimeException("Unsupported literal type " + v.getClass + " " + v)
   }
+
+  private def componentTypeToDataType(clz: Class[_]): DataType = clz match {
--- End diff --
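
For context: the hunk above truncates the body of `componentTypeToDataType`. A minimal sketch of what such a `Class[_]`-to-`DataType` mapping could look like, assuming only the standard names in org.apache.spark.sql.types; the wrapper object `ComponentTypeSketch` and the exact clause list are illustrative, not the PR's actual code:

    import org.apache.spark.sql.types._

    object ComponentTypeSketch {
      // Illustrative mapping from a Java array component class to a Catalyst DataType.
      def componentTypeToDataType(clz: Class[_]): DataType = clz match {
        // Primitive component classes, e.g. Array(1, 2, 3).getClass.getComponentType
        case java.lang.Integer.TYPE => IntegerType
        case java.lang.Long.TYPE => LongType
        case java.lang.Double.TYPE => DoubleType
        case java.lang.Boolean.TYPE => BooleanType
        // Reference component classes
        case c if c == classOf[String] => StringType
        case other => throw new RuntimeException("Unsupported component type " + other)
      }
    }

Under the `Array[_]` case in the hunk, `Literal(Array(1, 2, 3))` would resolve the element type through such a mapping and produce a literal of `ArrayType(IntegerType)`.
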
This looks very similar to the other places where Spark has to map Scala types to Spark SQL types, e.g. https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/ScalaReflection.scala#L65. Comparing the two, I noticed that `CalendarInterval` is not included in your list. Why?
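
For comparison, a hypothetical clause of the same `classOf`-based shape ScalaReflection uses, which would cover the interval type; `CalendarInterval` lives in org.apache.spark.unsafe.types and `CalendarIntervalType` in org.apache.spark.sql.types, while the wrapper object and method name here are illustrative only:

    import org.apache.spark.sql.types.{CalendarIntervalType, DataType}
    import org.apache.spark.unsafe.types.CalendarInterval

    object IntervalClauseSketch {
      // Hypothetical: mirrors ScalaReflection's classOf-based cases; without a
      // clause like this, an Array[CalendarInterval] falls through to the
      // "Unsupported" branch of the component-type mapping.
      def intervalComponentType(clz: Class[_]): DataType = clz match {
        case c if c == classOf[CalendarInterval] => CalendarIntervalType
        case other => throw new RuntimeException("Unsupported component type " + other)
      }
    }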