Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20756#discussion_r176956651
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/objects/objects.scala ---
    @@ -1261,8 +1261,39 @@ case class InitializeJavaBean(beanInstance: Expression, setters: Map[String, Exp
       override def children: Seq[Expression] = beanInstance +: setters.values.toSeq
       override def dataType: DataType = beanInstance.dataType
     
    -  override def eval(input: InternalRow): Any =
    -    throw new UnsupportedOperationException("Only code-generated evaluation is supported.")
    +  private lazy val resolvedSetters = {
    +    val ObjectType(beanClass) = beanInstance.dataType
    +    setters.map {
    +      case (name, expr) =>
    +        // Looking for known type mapping first, then using Class attached in `ObjectType`.
    +        // Finally also looking for general `Object`-type parameter for generic methods.
    +        val paramTypes = CallMethodViaReflection.typeMapping.getOrElse(expr.dataType,
    --- End diff --
    
    Sorry for not coming back to this sooner. AFAIK the `CallMethodViaReflection` expression was only designed to work with a couple of primitives. I think we are looking for something a little more complete here, i.e. support for all types in Spark SQL's type system. I also don't think we should put the mappings in `CallMethodViaReflection`, because the mapping is now used in more expressions; `ScalaReflection` is IMO a better place for this logic.
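    For illustration, a minimal sketch of what a more complete mapping could look like if it lived in `ScalaReflection`. The helper name `javaBoxedType` and its exact cases are my assumption here, not code from this PR; it maps each Catalyst `DataType` to the Java class of its internal representation (`UTF8String`, `ArrayData`, `MapData`, `InternalRow`, etc.) rather than only a few primitives:
    
        import org.apache.spark.sql.catalyst.InternalRow
        import org.apache.spark.sql.catalyst.util.{ArrayData, MapData}
        import org.apache.spark.sql.types._
        import org.apache.spark.unsafe.types.{CalendarInterval, UTF8String}
    
        // Hypothetical helper: maps a Catalyst DataType to the Java class of its
        // internal (boxed) representation, covering the whole type system.
        def javaBoxedType(dt: DataType): Class[_] = dt match {
          case BooleanType              => classOf[java.lang.Boolean]
          case ByteType                 => classOf[java.lang.Byte]
          case ShortType                => classOf[java.lang.Short]
          case IntegerType | DateType   => classOf[java.lang.Integer]  // dates are stored as days (Int)
          case LongType | TimestampType => classOf[java.lang.Long]     // timestamps are stored as micros (Long)
          case FloatType                => classOf[java.lang.Float]
          case DoubleType               => classOf[java.lang.Double]
          case _: DecimalType           => classOf[Decimal]
          case StringType               => classOf[UTF8String]
          case BinaryType               => classOf[Array[Byte]]
          case CalendarIntervalType     => classOf[CalendarInterval]
          case _: ArrayType             => classOf[ArrayData]
          case _: MapType               => classOf[MapData]
          case _: StructType            => classOf[InternalRow]
          case ObjectType(cls)          => cls
          case _                        => classOf[java.lang.Object]
        }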
    
    Finally, which PR will implement this? cc @maropu for visibility.

