Github user yhuai commented on a diff in the pull request:

    https://github.com/apache/spark/pull/8800#discussion_r39882820
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveUDFs.scala ---
    @@ -59,19 +59,52 @@ private[hive] class HiveFunctionRegistry(underlying: analysis.FunctionRegistry)
     
           val functionClassName = functionInfo.getFunctionClass.getName
     
    -      if (classOf[UDF].isAssignableFrom(functionInfo.getFunctionClass)) {
    -        HiveSimpleUDF(new HiveFunctionWrapper(functionClassName), children)
    -      } else if (classOf[GenericUDF].isAssignableFrom(functionInfo.getFunctionClass)) {
    -        HiveGenericUDF(new HiveFunctionWrapper(functionClassName), children)
    -      } else if (
    -        classOf[AbstractGenericUDAFResolver].isAssignableFrom(functionInfo.getFunctionClass)) {
    -        HiveGenericUDAF(new HiveFunctionWrapper(functionClassName), children)
    -      } else if (classOf[UDAF].isAssignableFrom(functionInfo.getFunctionClass)) {
    -        HiveUDAF(new HiveFunctionWrapper(functionClassName), children)
    -      } else if (classOf[GenericUDTF].isAssignableFrom(functionInfo.getFunctionClass)) {
    -        HiveGenericUDTF(new HiveFunctionWrapper(functionClassName), children)
    -      } else {
    -        sys.error(s"No handler for udf ${functionInfo.getFunctionClass}")
    +      try {
    +        // Based on the class of the function, we create the corresponding Hive function.
    +        // To trigger any error, such as invalid arguments, we call the dataType method
    +        // to make sure the internal Hive function resolver gets created without any issue.
    +        // The only exception here is GenericUDTF: because it does not support the
    +        // dataType method, we call elementTypes instead.
    +        val resolvedHiveFunction =
    +          if (classOf[UDF].isAssignableFrom(functionInfo.getFunctionClass)) {
    +            val func = HiveSimpleUDF(new HiveFunctionWrapper(functionClassName), children)
    +            // We actually only need func.dataType. Assigning the returned value to a val
    +            // is just to make the compiler happy.
    +            val notUsed = func.dataType
    --- End diff --
    
    hmm, it is not really safe to call `dataType` to trigger the evaluation of 
those lazy vals.
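To make the concern concrete, here is a minimal standalone Scala sketch of the pattern the patch uses: eagerly forcing a lazy val to surface resolution errors at lookup time. All names below (`Resolved`, `resolve`) are illustrative, not Spark or Hive APIs. The caveat the comment hints at: when a lazy val's initializer throws, Scala does not mark it initialized, so a later access re-runs the initializer and the same error can resurface past this eager check.

```scala
// Illustrative sketch only: Resolved and resolve are made-up names, not Spark APIs.
object LazyValDemo {
  // Stands in for a HiveSimpleUDF-like wrapper whose dataType is a lazy val.
  class Resolved(argCount: Int) {
    var attempts = 0
    // Resolution work happens on first access, like `lazy val dataType` in the patch.
    lazy val dataType: String = {
      attempts += 1
      if (argCount != 1) {
        throw new IllegalArgumentException(s"expected 1 argument, got $argCount")
      }
      "int"
    }
  }

  // Eagerly force the lazy val so invalid arguments fail at registry-lookup time
  // rather than later, in the middle of query execution.
  def resolve(argCount: Int): Either[String, Resolved] = {
    val func = new Resolved(argCount)
    try {
      func.dataType // trigger validation; the value itself is not needed here
      Right(func)
    } catch {
      case e: IllegalArgumentException => Left(e.getMessage)
    }
  }
}
```

Note that on the failure path the lazy val is left uninitialized, so any code that later touches `dataType` on the same instance re-triggers the exception; the eager call converts the error into an early failure but does not make subsequent accesses safe.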

