Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18544#discussion_r202259987
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveSessionCatalog.scala ---
    @@ -129,14 +129,14 @@ private[sql] class HiveSessionCatalog(
         Try(super.lookupFunction(funcName, children)) match {
           case Success(expr) => expr
           case Failure(error) =>
    -        if (functionRegistry.functionExists(funcName)) {
     -          // If the function actually exists in functionRegistry, it means that there is an
     -          // error when we create the Expression using the given children.
    +        if (super.functionExists(name)) {
     +          // If the function actually exists in functionRegistry or externalCatalog,
     +          // it means that there is an error when we create the Expression using the given children.
               // We need to throw the original exception.
               throw error
             } else {
     -          // This function is not in functionRegistry, let's try to load it as a Hive's
     -          // built-in function.
     +          // This function is not in functionRegistry or externalCatalog,
     +          // let's try to load it as a Hive's built-in function.
               // Hive is case insensitive.
               val functionName = funcName.unquotedString.toLowerCase(Locale.ROOT)
               if (!hiveFunctions.contains(functionName)) {
    --- End diff --
    
    We do not need to change the other parts. We just need to throw the 
exception in `failFunctionLookup(funcName)`, right?
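
    The lookup-with-fallback pattern being reviewed can be sketched roughly as follows. This is a simplified, self-contained sketch, not the actual `HiveSessionCatalog` code: `knownFunctions`, `hiveBuiltins`, `createExpression`, and the stubbed `failFunctionLookup` are hypothetical stand-ins for the registry/catalog machinery.

    ```scala
    import scala.util.{Failure, Success, Try}

    object LookupSketch {
      // Stand-in for functionRegistry / externalCatalog membership (hypothetical).
      val knownFunctions = Set("lower", "upper")
      // Stand-in for the hiveFunctions whitelist of Hive built-ins (hypothetical).
      val hiveBuiltins = Set("histogram_numeric")

      // Succeeds only for "lower"; fails otherwise, simulating an error while
      // constructing the Expression from the given children.
      def createExpression(name: String): String =
        if (name == "lower") s"Expr($name)"
        else throw new IllegalArgumentException(s"cannot build expression for $name")

      // The single place where an undefined function is reported.
      def failFunctionLookup(name: String): Nothing =
        throw new NoSuchElementException(s"Undefined function: $name")

      def lookupFunction(name: String): String =
        Try(createExpression(name)) match {
          case Success(expr) => expr
          case Failure(error) =>
            if (knownFunctions.contains(name)) {
              // The function exists, so the failure came from building the
              // Expression with the given children: rethrow the original error.
              throw error
            } else if (hiveBuiltins.contains(name.toLowerCase)) {
              // Not registered; fall back to loading it as a Hive built-in
              // (stubbed here).
              s"HiveBuiltin($name)"
            } else {
              // Neither registered nor a Hive built-in: fail the lookup in
              // one place, which is the point of the review comment.
              failFunctionLookup(name)
            }
        }
    }
    ```

    The comment's suggestion amounts to keeping the `Success`/`Failure` branches as they were and routing only the final "not found anywhere" case through `failFunctionLookup`, rather than restructuring the surrounding logic.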


---
