[ 
https://issues.apache.org/jira/browse/SPARK-44743?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Wenchen Fan resolved SPARK-44743.
---------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 42661
[https://github.com/apache/spark/pull/42661]

> Reflect function behavior different from Hive
> ---------------------------------------------
>
>                 Key: SPARK-44743
>                 URL: https://issues.apache.org/jira/browse/SPARK-44743
>             Project: Spark
>          Issue Type: New Feature
>          Components: PySpark, SQL
>    Affects Versions: 3.4.1
>            Reporter: Nikhil Goyal
>            Priority: Major
>             Fix For: 4.0.0
>
>
> Spark's reflect function fails if the underlying method call throws an 
> exception, which causes the whole job to fail.
> In Hive, however, the exception is caught and null is returned. A simple 
> test reproduces the behavior:
> {code:java}
> select reflect('java.net.URLDecoder', 'decode', '%') {code}
> The workaround would be to wrap this call in a try/catch here:
> [https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/CallMethodViaReflection.scala#L136]
> We can support this by adding a new UDF `try_reflect` that mimics Hive's 
> behavior. Please share your thoughts on this.
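The proposed semantics can be sketched in plain Java (the helper name `tryDecode` is illustrative, not part of the proposal): the reflected call is wrapped in a try/catch, and a failure yields null instead of propagating, which is what Hive's reflect does for the `URLDecoder.decode('%')` case above.

```java
import java.net.URLDecoder;

public class TryReflectSketch {
    // Illustrative helper mirroring the proposed try_reflect behavior:
    // catch any exception from the reflected call and return null,
    // rather than failing the whole job.
    static String tryDecode(String s) {
        try {
            return URLDecoder.decode(s, "UTF-8");
        } catch (Exception e) {
            return null; // Hive's reflect swallows the exception and yields NULL
        }
    }

    public static void main(String[] args) {
        System.out.println(tryDecode("a%20b")); // valid input decodes normally
        System.out.println(tryDecode("%"));     // malformed escape: null, not an error
    }
}
```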



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
