gbloisi-openaire commented on code in PR #42634:
URL: https://github.com/apache/spark/pull/42634#discussion_r1315180662


##########
sql/api/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala:
##########
@@ -156,4 +158,17 @@ object JavaTypeInference {
       .filterNot(_.getName == "declaringClass")
       .filter(_.getReadMethod != null)
   }
+
+  @tailrec
+  def getClassHierarchyTypeArguments(cls: Class[_],

Review Comment:
   On second thought, this recursive `getClassHierarchyTypeArguments` is not 
required, as `JavaTypeUtils.getTypeArguments(cls, classOf[Object])` already 
traverses the inheritance hierarchy up to the `Object` base class, collecting 
type variables along the way. So I just removed it.
   To be clear, what we are doing here is collecting type variable information 
about the base class (which in turn could derive from another base class, 
forming a hierarchy that needs to be traversed).
   This is required both for the provided class and for nested beans that extend 
one or more generic base classes in their inheritance hierarchy (tests provided).
   Interfaces do not need to be processed because they declare abstract 
methods only; the definition of a method (and its actual return type) 
comes from the class implementing the interface.
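
   To illustrate the kind of traversal involved (not Spark's actual code): a 
minimal, stdlib-only sketch of what resolving type variables across a class 
hierarchy looks like, using a hypothetical `Base`/`Mid`/`Leaf` hierarchy. 
`JavaTypeUtils.getTypeArguments` (Apache Commons Lang's `TypeUtils`) performs 
essentially this walk internally, which is why the hand-rolled recursion was 
redundant.

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.lang.reflect.TypeVariable;
import java.util.HashMap;
import java.util.Map;

// Hypothetical hierarchy: Leaf extends Mid<String> extends Base<S>.
class Base<T> {}
class Mid<S> extends Base<S> {}
class Leaf extends Mid<String> {}

public class TypeArgsDemo {
  // Walk the superclass chain up to Object, recording what each type
  // variable is bound to at each level; bindings to other type variables
  // (e.g. Base's T bound to Mid's S) are chased to their concrete type.
  static Map<TypeVariable<?>, Type> typeArguments(Class<?> cls) {
    Map<TypeVariable<?>, Type> bindings = new HashMap<>();
    for (Class<?> c = cls; c != null && c != Object.class; c = c.getSuperclass()) {
      Type sup = c.getGenericSuperclass();
      if (sup instanceof ParameterizedType) {
        ParameterizedType pt = (ParameterizedType) sup;
        TypeVariable<?>[] vars = ((Class<?>) pt.getRawType()).getTypeParameters();
        Type[] args = pt.getActualTypeArguments();
        for (int i = 0; i < vars.length; i++) {
          Type arg = args[i];
          // Substitute a variable already bound at a lower level.
          while (arg instanceof TypeVariable && bindings.containsKey(arg)) {
            arg = bindings.get(arg);
          }
          bindings.put(vars[i], arg);
        }
      }
    }
    return bindings;
  }

  public static void main(String[] args) {
    Map<TypeVariable<?>, Type> m = typeArguments(Leaf.class);
    // Both Mid.S and Base.T resolve to java.lang.String.
    m.forEach((v, t) -> System.out.println(
        v.getGenericDeclaration() + "." + v.getName() + " -> " + t.getTypeName()));
  }
}
```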
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

