This is an automated email from the ASF dual-hosted git repository.

yamamuro pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.0 by this push:
     new cf14897  [SPARK-32677][SQL][DOCS][MINOR] Improve code comment in CreateFunctionCommand
cf14897 is described below

commit cf14897d355efbf4acb3497ef1b74cd3a9c35d59
Author: Wenchen Fan <wenc...@databricks.com>
AuthorDate: Fri Sep 11 09:22:56 2020 +0900

    [SPARK-32677][SQL][DOCS][MINOR] Improve code comment in CreateFunctionCommand

    ### What changes were proposed in this pull request?

    We made a mistake in https://github.com/apache/spark/pull/29502, as there is no
    code comment to explain why we can't load the UDF class when creating functions.
    This PR improves the code comment.

    ### Why are the changes needed?

    To avoid making the same mistake.

    ### Does this PR introduce _any_ user-facing change?

    No

    ### How was this patch tested?

    N/A

    Closes #29713 from cloud-fan/comment.

    Authored-by: Wenchen Fan <wenc...@databricks.com>
    Signed-off-by: Takeshi Yamamuro <yamam...@apache.org>
    (cherry picked from commit 328d81a2d1131742bcfba5117896c093db39e721)
    Signed-off-by: Takeshi Yamamuro <yamam...@apache.org>
---
 .../main/scala/org/apache/spark/sql/execution/command/functions.scala  | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/command/functions.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/command/functions.scala
index 6fdc7f4..d55d696 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/command/functions.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/command/functions.scala
@@ -88,7 +88,9 @@ case class CreateFunctionCommand(
       }
     } else {
       // For a permanent, we will store the metadata into underlying external catalog.
       // This function will be loaded into the FunctionRegistry when a query uses it.
-      // We do not load it into FunctionRegistry right now.
+      // We do not load it into FunctionRegistry right now, to avoid loading the resource and
+      // UDF class immediately, as the Spark application to create the function may not have
+      // access to the resource and/or UDF class.
       catalog.createFunction(func, ignoreIfExists)
     }
   }

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
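The behavior the new comment documents, persisting a permanent function's metadata at CREATE time and deferring resource/class loading until a query first uses the function, can be sketched with a toy catalog. This is a minimal illustration, not Spark's actual `SessionCatalog` or `FunctionRegistry` API; all names here (`LazyFunctionCatalog`, `loads`, the stub loader) are hypothetical.

```scala
// Hypothetical sketch of the create-without-load pattern described above.
// CREATE FUNCTION only records metadata; the (simulated) class load happens
// lazily on first lookup, so the creating application never needs access
// to the UDF class or its resources.
object LazyFunctionCatalog {
  // Stands in for the external catalog: function name -> UDF class name.
  private val catalog = scala.collection.mutable.Map[String, String]()
  // Stands in for the FunctionRegistry: populated only on first use.
  private val registry = scala.collection.mutable.Map[String, () => Int]()
  // Counts simulated class loads, to show none happen at creation time.
  var loads = 0

  def createFunction(name: String, className: String): Unit = {
    // Like CreateFunctionCommand for permanent functions: metadata only,
    // no attempt to resolve or load className here.
    catalog(name) = className
  }

  def lookup(name: String): () => Int = {
    // First use by a "query": load (simulated) and register the function.
    registry.getOrElseUpdate(name, {
      loads += 1 // stands in for loading the resource and UDF class
      () => 42   // stub UDF body
    })
  }
}

object Demo {
  def main(args: Array[String]): Unit = {
    LazyFunctionCatalog.createFunction("f", "com.example.MyUDF")
    assert(LazyFunctionCatalog.loads == 0) // nothing loaded at CREATE time
    LazyFunctionCatalog.lookup("f")
    assert(LazyFunctionCatalog.loads == 1) // loaded on first use
    LazyFunctionCatalog.lookup("f")
    assert(LazyFunctionCatalog.loads == 1) // cached thereafter
    println("ok")
  }
}
```

The design point mirrors the comment in the diff: eagerly loading at creation would force the creating application to have access to the JAR and class, which it may not, so loading is deferred to first use.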