[jira] [Updated] (SPARK-4811) Custom UDTFs not working in Spark SQL

2015-04-07 Thread Cheng Lian (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-4811?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Cheng Lian updated SPARK-4811:
--
Affects Version/s: 1.2.1, 1.3.0

 Custom UDTFs not working in Spark SQL
 -

 Key: SPARK-4811
 URL: https://issues.apache.org/jira/browse/SPARK-4811
 Project: Spark
  Issue Type: Bug
  Components: SQL
Affects Versions: 1.1.0, 1.1.1, 1.2.1, 1.3.0
Reporter: Saurabh Santhosh
Priority: Critical

 I am using the Thrift server interface to Spark SQL and connecting to it 
 with Beeline.
 I tried Spark SQL versions 1.1.0 and 1.1.1, and both throw the following 
 exception when using any custom UDTF.
 These are the steps I followed:
 *Created a UDTF 'com.x.y.xxx'.*
 Registered the UDTF with the following query: 
 *create temporary function xxx as 'com.x.y.xxx'*
 The registration went through without any errors, but when I tried executing 
 the UDTF I got the following error:
 *java.lang.ClassNotFoundException: xxx*
 The odd thing is that it is trying to load the function name instead of the 
 function class. The exception is thrown at *line 81 in hiveUdfs.scala*.
 I have been at this for quite a long time.
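 The reported mismatch can be illustrated outside Spark: on the JVM, resolving 
 a class by the registered function *name* rather than by its fully qualified 
 class name fails with exactly this exception. A minimal, self-contained Java 
 sketch (the names `xxx` and `com.x.y.xxx` are taken from the report; the 
 class `UdtfLookupDemo` is hypothetical):

```java
// Illustrates the reported failure mode: handing the registered function
// name ("xxx") to the classloader instead of the class name ("com.x.y.xxx").
public class UdtfLookupDemo {

    // Attempt to load a class; return null on success, or the
    // ClassNotFoundException message on failure.
    static String tryLoad(String name) {
        try {
            Class.forName(name);
            return null;
        } catch (ClassNotFoundException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        // What the bug effectively does: look up the function *name*.
        System.out.println(tryLoad("xxx"));          // prints "xxx"
        // What it should do: look up the fully qualified class name
        // (also absent in this demo JVM, but resolvable in a deployment
        // where the UDTF jar is on the classpath).
        System.out.println(tryLoad("com.x.y.xxx"));
    }
}
```

 This matches the symptom above: the exception message is the bare function 
 name, which is what `Class.forName` was given.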



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-4811) Custom UDTFs not working in Spark SQL

2015-02-02 Thread Michael Armbrust (JIRA)


Michael Armbrust updated SPARK-4811:

Target Version/s: 1.4.0  (was: 1.3.0)







[jira] [Updated] (SPARK-4811) Custom UDTFs not working in Spark SQL

2014-12-19 Thread Michael Armbrust (JIRA)


Michael Armbrust updated SPARK-4811:

Target Version/s: 1.3.0  (was: 1.2.0)







[jira] [Updated] (SPARK-4811) Custom UDTFs not working in Spark SQL

2014-12-10 Thread Saurabh Santhosh (JIRA)


Saurabh Santhosh updated SPARK-4811:

Priority: Critical  (was: Major)







[jira] [Updated] (SPARK-4811) Custom UDTFs not working in Spark SQL

2014-12-10 Thread Michael Armbrust (JIRA)


Michael Armbrust updated SPARK-4811:

Fix Version/s: (was: 1.2.0)







[jira] [Updated] (SPARK-4811) Custom UDTFs not working in Spark SQL

2014-12-09 Thread Saurabh Santhosh (JIRA)


Saurabh Santhosh updated SPARK-4811:

Description: updated (appended "I have been at it for quite a long time."; 
the text otherwise matches the issue description above)


