Github user mt40 commented on a diff in the pull request:
https://github.com/apache/spark/pull/22309#discussion_r234085471
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/ScalaReflection.scala ---
@@ -373,6 +383,32 @@ object ScalaReflection extends ScalaReflection {
dataType = ObjectType(udt.getClass))
Invoke(obj, "deserialize", ObjectType(udt.userClass), path :: Nil)
+ case t if isValueClass(t) =>
+ val (_, underlyingType) = getUnderlyingParameterOf(t)
+ val underlyingClsName = getClassNameFromType(underlyingType)
+ val clsName = getUnerasedClassNameFromType(t)
+ val newTypePath = s"""- Scala value class: $clsName($underlyingClsName)""" +:
+ walkedTypePath
+
+ // A nested value class is treated as its underlying type
+ // because the compiler converts a value class in the schema to
+ // its underlying type.
+ // However, for a value class that is top-level or an array element,
+ // if it is used as another type (e.g. as its parent trait or a generic
+ // type argument), the compiler keeps the class, so we must provide an
+ // instance of the class too. In other cases, the compiler handles
+ // wrapping/unwrapping for us automatically.
+ val arg = deserializerFor(underlyingType, path, newTypePath, Some(t))
+ val isCollectionElement = lastType.exists { lt =>
+ lt <:< localTypeOf[Array[_]] || lt <:< localTypeOf[Seq[_]]
--- End diff --
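
For context on the compiler behavior the quoted comment describes, here is a
minimal, hypothetical sketch (names are made up, not part of the PR): used
directly, a value class is erased to its underlying type, but as an element of
an array, a generic collection, or a map it is boxed, which is why the
deserializer must construct the wrapper instance in those positions.

    // Hypothetical illustration, not from the PR: a plain Scala value class.
    class Id(val value: Int) extends AnyVal

    object ValueClassAllocation {
      def main(args: Array[String]): Unit = {
        // Used directly, `Id` is erased to a plain Int at runtime (no allocation).
        val direct: Id = new Id(1)

        // As an element of Array, Seq, or Map, the compiler boxes it, so real
        // `Id` instances exist at runtime and must be produced by a deserializer.
        val asArray: Array[Id] = Array(new Id(2))
        val asSeq: Seq[Id] = Seq(new Id(3))
        val asMap: Map[String, Id] = Map("a" -> new Id(4))

        println(direct.value + asArray(0).value + asSeq.head.value + asMap("a").value)
      }
    }
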
I added support for Map.
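
As a hedged illustration of the case the Map support presumably covers (class
and field names here are hypothetical, not taken from the PR's tests): with
this change, a Dataset whose rows hold a value class inside a Map should
round-trip, since map keys and values are generic type arguments and therefore
boxed at runtime.

    import org.apache.spark.sql.SparkSession

    // Hypothetical classes, not from the PR's test suite.
    case class Price(amount: Double) extends AnyVal
    case class Order(id: Int, pricesByItem: Map[String, Price])

    object ValueClassInMapExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[*]")
          .appName("value-class-in-map")
          .getOrCreate()
        import spark.implicits._

        // Deserializing requires rebuilding `Price` instances for the map values.
        val ds = Seq(Order(1, Map("book" -> Price(9.99)))).toDS()
        ds.collect().foreach(println)

        spark.stop()
      }
    }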