This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new aedc273  [SPARK-37731][SQL][FOLLOWUP] Update generator function lookup and migration guide
aedc273 is described below

commit aedc273107fd3f8852c380192f463240423c2c25
Author: allisonwang-db <[email protected]>
AuthorDate: Tue Jan 25 11:10:51 2022 +0800

    [SPARK-37731][SQL][FOLLOWUP] Update generator function lookup and migration guide
    
    ### What changes were proposed in this pull request?
    This PR is a follow-up for SPARK-37731. It updates the Analyzer logic for resolving generator functions to match the behavior before SPARK-37731, and updates the migration guide to document another behavior change: dropping a persistent function that has the same name as a built-in function.
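
    For illustration only (not part of this commit): a minimal spark-shell sketch of the documented DROP FUNCTION behavior change, assuming a hypothetical persistent function `default.abs` whose unqualified name shadows the built-in `abs` (the implementing class below is made up):

    ```scala
    // Hypothetical setup: a persistent function named like the built-in `abs`.
    spark.sql("CREATE FUNCTION default.abs AS 'com.example.MyAbs'")

    // Since Spark 3.3, the unqualified name matches a built-in function's name,
    // so this DROP FUNCTION fails instead of dropping the persistent function.
    spark.sql("DROP FUNCTION abs")

    // The qualified name still drops the persistent function.
    spark.sql("DROP FUNCTION default.abs")
    ```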
    
    ### Why are the changes needed?
    Follow-up for SPARK-37731.
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    Existing tests
    
    Closes #35275 from allisonwang-db/spark-37731-follow-up.
    
    Authored-by: allisonwang-db <[email protected]>
    Signed-off-by: Wenchen Fan <[email protected]>
---
 docs/sql-migration-guide.md                                    |  2 ++
 .../org/apache/spark/sql/catalyst/analysis/Analyzer.scala      | 10 ++++++----
 .../org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala |  5 ++---
 3 files changed, 10 insertions(+), 7 deletions(-)

diff --git a/docs/sql-migration-guide.md b/docs/sql-migration-guide.md
index 01c828a..63fc51a 100644
--- a/docs/sql-migration-guide.md
+++ b/docs/sql-migration-guide.md
@@ -58,6 +58,8 @@ license: |
 
   - Since Spark 3.3, the table property `external` becomes reserved. Certain commands will fail if you specify the `external` property, such as `CREATE TABLE ... TBLPROPERTIES` and `ALTER TABLE ... SET TBLPROPERTIES`. In Spark 3.2 and earlier, the table property `external` is silently ignored. You can set `spark.sql.legacy.notReserveProperties` to `true` to restore the old behavior.
 
+  - Since Spark 3.3, DROP FUNCTION fails if the function name matches one of the built-in functions' names and is not qualified. In Spark 3.2 or earlier, DROP FUNCTION can still drop a persistent function even if the name is not qualified and is the same as a built-in function's name.
+
 ## Upgrading from Spark SQL 3.1 to 3.2
 
   - Since Spark 3.2, ADD FILE/JAR/ARCHIVE commands require each path to be enclosed by `"` or `'` if the path contains whitespaces.
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala
index 103a445..d31f90a 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala
@@ -2078,10 +2078,12 @@ class Analyzer(override val catalogManager: CatalogManager)
           case u if !u.childrenResolved => u // Skip until children are resolved.
 
           case u @ UnresolvedGenerator(name, arguments) => withPosition(u) {
-            resolveBuiltinOrTempFunction(name.asMultipart, arguments, None).getOrElse {
-              // For generator function, the parser only accepts v1 function name and creates
-              // `FunctionIdentifier`.
-              v1SessionCatalog.resolvePersistentFunction(name, arguments)
+            // For generator function, the parser only accepts v1 function name and creates
+            // `FunctionIdentifier`.
+            v1SessionCatalog.lookupFunction(name, arguments) match {
+              case generator: Generator => generator
+              case other => throw QueryCompilationErrors.generatorNotExpectedError(
+                name, other.getClass.getCanonicalName)
             }
           }
 
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
index c712d2c..ad007f1 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
@@ -1715,9 +1715,8 @@ class SessionCatalog(
     }
   }
 
-  // Test only. The actual function lookup logic looks up temp/built-in function first, then
-  // persistent function from either v1 or v2 catalog. This method only look up v1 catalog and is
-  // no longer valid.
+  // The actual function lookup logic looks up temp/built-in function first, then persistent
+  // function from either v1 or v2 catalog. This method only looks up v1 catalog.
   def lookupFunction(name: FunctionIdentifier, children: Seq[Expression]): Expression = {
     if (name.database.isEmpty) {
       resolveBuiltinOrTempFunction(name.funcName, children)

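For illustration only (not part of this commit): a minimal spark-shell sketch of the generator resolution shown in the Analyzer hunk above. A generator such as `explode` still resolves through the v1 session catalog, while a non-generator used in a generator position triggers the `generatorNotExpectedError` path:

```scala
// `explode` resolves to a Generator, so the LATERAL VIEW works as before.
spark.sql(
  "SELECT x FROM VALUES (array(1, 2)) AS t(a) LATERAL VIEW explode(a) tmp AS x").show()

// `abs` resolves to a non-Generator expression, so analysis fails with the
// "not a generator" compilation error thrown above.
spark.sql(
  "SELECT x FROM VALUES (array(1, 2)) AS t(a) LATERAL VIEW abs(a) tmp AS x")
```
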
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
