Spark SQL 1.2 with CDH 4, Hive UDF is not working.

2014-12-22 Thread Ji ZHANG
Hi,

I'm currently migrating from Shark 0.9 to Spark SQL 1.2. My CDH version
is 4.5, with Hive 0.11. I've managed to set up the Spark SQL Thrift
server, and normal queries work fine, but custom UDFs are not usable.

The symptom is that when executing CREATE TEMPORARY FUNCTION, the query
hangs waiting on a ZooKeeper lock request:

14/12/22 14:41:57 DEBUG ClientCnxn: Reading reply
sessionid:0x34a6121e6d93e74, packet:: clientPath:null serverPath:null
finished:false header:: 289,8  replyHeader:: 289,51540866762,0
request:: '/hive_zookeeper_namespace_hive1/default,F  response::
v{'sample_07,'LOCK-EXCLUSIVE-0001565612,'LOCK-EXCLUSIVE-0001565957}
14/12/22 14:41:57 ERROR ZooKeeperHiveLockManager: conflicting lock
present for default mode EXCLUSIVE
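For reference, the statement that triggers the hang is of the following form (the JAR path, function name, and UDF class are placeholders, not the actual ones from my setup):

```sql
-- Register a custom Hive UDF in the current session.
-- Paths and class names below are illustrative only.
ADD JAR /path/to/my-udfs.jar;
CREATE TEMPORARY FUNCTION my_udf AS 'com.example.hive.udf.MyUdf';
```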

Is this a compatibility issue, given that Spark SQL 1.2 is built against
Hive 0.13? Is there a workaround other than upgrading CDH or forbidding
UDFs on Spark SQL?

Thanks.

-- 
Jerry

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Spark SQL 1.2 with CDH 4, Hive UDF is not working.

2014-12-22 Thread Cheng Lian

Hi Ji,

Spark SQL 1.2 only works with either Hive 0.12.0 or 0.13.1, due to Hive 
API/protocol compatibility issues. When interacting with Hive 0.11.x, 
connections and simple queries may succeed, but things can break in 
unexpected places (such as UDFs).
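One thing that might be worth trying (untested against this exact CDH 4.5 / Hive 0.11 setup, so treat it as a sketch): the hang happens in ZooKeeperHiveLockManager, so disabling Hive's concurrency support in the hive-site.xml used by the Thrift server would stop CREATE TEMPORARY FUNCTION from requesting a ZooKeeper lock at all. Note this turns off Hive table locking entirely, which may be unsafe if the metastore is shared with other concurrent writers:

```xml
<!-- hive-site.xml fragment: disable Hive's lock manager so DDL
     statements no longer acquire ZooKeeper locks. Use with caution
     on a shared metastore. -->
<property>
  <name>hive.support.concurrency</name>
  <value>false</value>
</property>
```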


Cheng

On 12/22/14 4:15 PM, Ji ZHANG wrote:




