GitHub user caneGuy opened a pull request:
https://github.com/apache/spark/pull/21552
[SPARK-24544][SQL] Print actual failure cause when look up function failed
## What changes were proposed in this pull request?
When we run the following:
```
0: jdbc:hive2://xxx/> create function funnel_analysis as 'com.xxx.hive.extend.udf.UapFunnelAnalysis';
```
```
0: jdbc:hive2://xxx/> select funnel_analysis(1,",",1,'');
Error: org.apache.spark.sql.AnalysisException: Undefined function: 'funnel_analysis'. This function is neither a registered temporary function nor a permanent function registered in the database 'xxx'.; line 1 pos 7 (state=,code=0)
```
```
0: jdbc:hive2://xxx/> describe function funnel_analysis;
+--------------------------------------------------+--+
|                  function_desc                   |
+--------------------------------------------------+--+
| Function: mifi.funnel_analysis                   |
| Class: com.xxx.hive.extend.udf.UapFunnelAnalysis |
| Usage: N/A.                                      |
+--------------------------------------------------+--+
```
As shown above, `describe function` returns the correct information, but actually calling the function fails with an "undefined function" exception. This is really misleading; the real cause is the following:
```
No handler for Hive UDF 'com.xiaomi.mifi.hive.extend.udf.UapFunnelAnalysis': java.lang.IllegalStateException: Should not be called directly;
    at org.apache.hadoop.hive.ql.udf.generic.GenericUDTF.initialize(GenericUDTF.java:72)
    at org.apache.spark.sql.hive.HiveGenericUDTF.outputInspector$lzycompute(hiveUDFs.scala:204)
    at org.apache.spark.sql.hive.HiveGenericUDTF.outputInspector(hiveUDFs.scala:204)
    at org.apache.spark.sql.hive.HiveGenericUDTF.elementSchema$lzycompute(hiveUDFs.scala:212)
    at org.apache.spark.sql.hive.HiveGenericUDTF.elementSchema(hiveUDFs.scala:212)
```
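For context, the root cause gets hidden because the lookup path catches the instantiation failure and reports only a generic "undefined function" error. A minimal Scala sketch of that pattern (the object and method names here are illustrative stand-ins, not Spark's actual internals):

```scala
// Illustrative sketch only -- simplified stand-ins, not Spark's actual code.
// It shows how catching the instantiation failure and rethrowing a generic
// error hides the real cause ("Should not be called directly").
object MaskedLookupDemo {

  // Stand-in for building the Hive UDF/UDTF expression, which is roughly
  // where the IllegalStateException in the stack trace above comes from.
  private def makeFunctionExpression(className: String): Nothing =
    throw new IllegalStateException("Should not be called directly")

  def lookupFunction(db: String, name: String, className: String): Nothing =
    try {
      makeFunctionExpression(className)
    } catch {
      case _: Throwable =>
        // The original exception is dropped here, so callers only ever see
        // the generic "Undefined function" message.
        throw new RuntimeException(
          s"Undefined function: '$name'. This function is neither a registered " +
            s"temporary function nor a permanent function registered in the database '$db'.")
    }

  def main(args: Array[String]): Unit =
    lookupFunction("xxx", "funnel_analysis", "com.xxx.hive.extend.udf.UapFunnelAnalysis")
}
```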
This patch prints the actual failure cause so the root problem can be debugged quickly.
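A rough sketch of the idea, again with illustrative names rather than the code actually touched by this PR: surface the underlying exception, and keep it as the cause, before raising the user-facing error.

```scala
// Illustrative sketch of the fix idea -- hypothetical names, not the actual
// patch: log the real failure and chain it as the cause before raising the
// user-facing "Undefined function" error.
object VerboseLookupDemo {

  private def makeFunctionExpression(className: String): Nothing =
    throw new IllegalStateException("Should not be called directly")

  def lookupFunction(db: String, name: String, className: String): Nothing =
    try {
      makeFunctionExpression(className)
    } catch {
      case e: Throwable =>
        // Print (or log) the actual failure cause for quick debugging ...
        Console.err.println(s"Failed to instantiate function '$name' ($className): $e")
        // ... and keep it as the cause so the full stack trace is preserved.
        throw new RuntimeException(s"Undefined function: '$name' in database '$db'.", e)
    }

  def main(args: Array[String]): Unit =
    lookupFunction("xxx", "funnel_analysis", "com.xxx.hive.extend.udf.UapFunnelAnalysis")
}
```

Running this prints the `IllegalStateException` alongside the "Undefined function" error, which is exactly the kind of hint that was missing in the output above.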
## How was this patch tested?
Unit tests.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/caneGuy/spark zhoukang/print-warning
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/21552.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #21552
----
commit 15d018a1e72ddf6cd29d6942359b9e6bd5547f67
Author: zhoukang <zhoukang199191@...>
Date: 2018-06-13T10:23:01Z
[SPARK][SQL] Print actual failure cause when look up function failed
----