Nikhil Goyal created SPARK-44743:
------------------------------------

             Summary: Reflect function behavior different from Hive
                 Key: SPARK-44743
                 URL: https://issues.apache.org/jira/browse/SPARK-44743
             Project: Spark
          Issue Type: New Feature
          Components: PySpark, SQL
    Affects Versions: 3.4.1
            Reporter: Nikhil Goyal


Spark's reflect function fails if the underlying method call throws an exception, 
causing the whole job to fail.

In Hive, however, the exception is caught and null is returned. A simple test to 
reproduce the behavior:
{code:java}
select reflect('java.net.URLDecoder', 'decode', '%') {code}
A workaround would be to wrap the method invocation here in a try/catch:
[https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/CallMethodViaReflection.scala#L136]


We can support this by adding a new function, `try_reflect`, which mimics Hive's 
behavior. Please share your thoughts on this.
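To illustrate the proposed semantics, here is a minimal plain-Python sketch (the names `reflect` and `try_reflect` follow the proposal; the resolution logic is a stand-in for Spark's `CallMethodViaReflection`, not its actual implementation). `reflect` propagates any exception from the reflected call, while `try_reflect` catches it and returns None, analogous to Hive returning SQL NULL:

```python
import importlib


def reflect(module_name, func_name, *args):
    """Current behavior: resolve a callable by name and invoke it.
    Any exception from the underlying call propagates and fails the caller."""
    target = importlib.import_module(module_name)
    return getattr(target, func_name)(*args)


def try_reflect(module_name, func_name, *args):
    """Proposed Hive-like behavior: swallow the exception, return None (NULL)."""
    try:
        return reflect(module_name, func_name, *args)
    except Exception:
        return None
```

For example, `try_reflect('builtins', 'int', '%')` returns None where `reflect('builtins', 'int', '%')` raises ValueError, mirroring how Hive's reflect returns null for `reflect('java.net.URLDecoder', 'decode', '%')` while Spark's fails the job.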



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
