[ 
https://issues.apache.org/jira/browse/SPARK-20918?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Wenchen Fan updated SPARK-20918:
--------------------------------
    Description: 
Currently, the unquoted string of a function identifier is used as the 
function identifier in the function registry. This can cause incorrect 
behavior when users use `.` in function names. 

As an example, consider how Spark resolves a function call like this:
{code}
SELECT `d100.udf100`(`emp`.`name`) FROM `emp`;
{code}

Although the function name is wrapped in backticks, Spark still resolves it 
as database name + function name, which is wrong.
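The distinction can be sketched as follows. This is a minimal illustration, not Spark's actual parser: it only shows why a dot inside backticks must stay part of a single function name, while an unquoted dot separates the database qualifier from the function name.

```python
def parse_function_identifier(raw: str):
    """Split a SQL function reference into (database, funcName).

    Backtick-quoted segments are atomic: a dot inside backticks is part
    of the name itself, not a qualifier separator.
    """
    parts, buf, in_quotes = [], [], False
    for ch in raw:
        if ch == '`':
            in_quotes = not in_quotes      # toggle quoted mode
        elif ch == '.' and not in_quotes:
            parts.append(''.join(buf))     # unquoted dot: qualifier boundary
            buf = []
        else:
            buf.append(ch)
    parts.append(''.join(buf))
    if len(parts) == 1:
        return (None, parts[0])            # unqualified function name
    if len(parts) == 2:
        return (parts[0], parts[1])        # database-qualified name
    raise ValueError("unsupported identifier: " + raw)

# Quoted: one function name that happens to contain a dot.
assert parse_function_identifier("`d100.udf100`") == (None, "d100.udf100")
# Unquoted: a database qualifier plus a function name.
assert parse_function_identifier("d100.udf100") == ("d100", "udf100")
```

Carrying the parsed parts around as a structured {{FunctionIdentifier}} (rather than flattening back to an unquoted string) is what keeps the two cases distinct in the registry.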

  was:Currently, the unquoted string of a function identifier is used as 
the function identifier in the function registry. This can cause 
incorrect behavior when users use `.` in function names. 


> Use FunctionIdentifier as function identifiers in FunctionRegistry
> ------------------------------------------------------------------
>
>                 Key: SPARK-20918
>                 URL: https://issues.apache.org/jira/browse/SPARK-20918
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.2.0
>            Reporter: Xiao Li
>            Assignee: Xiao Li
>            Priority: Major
>             Fix For: 2.3.0
>
>
> Currently, the unquoted string of a function identifier is used as the 
> function identifier in the function registry. This can cause incorrect 
> behavior when users use `.` in function names. 
> As an example, consider how Spark resolves a function call like this:
> {code}
> SELECT `d100.udf100`(`emp`.`name`) FROM `emp`;
> {code}
> Although the function name is wrapped in backticks, Spark still resolves it 
> as database name + function name, which is wrong.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
