Thank you for pasting the steps. I will look into this and hopefully come up
with a solution soon.

-----Original Message-----
From: linkpatrickliu [mailto:linkpatrick...@live.com] 
Sent: Tuesday, September 16, 2014 3:17 PM
To: u...@spark.incubator.apache.org
Subject: RE: SparkSQL 1.1 hang when "DROP" or "LOAD"

Hi, Hao Cheng.

I have run some additional tests, and the results show that the thriftServer
can connect to ZooKeeper.

However, I found something even more interesting, and I think I have found a
bug!

Test procedure:
Test 1:
(0) Use beeline to connect to the thriftServer.
(1) Switch databases: "use dw_op1;" (OK)
The logs show that the thriftServer connected to ZooKeeper and acquired locks.
(2) Drop a table: "drop table src;" (Blocked)
The logs show that the thriftServer is stuck in "acquireReadWriteLocks".

My suspicion:
The reason I cannot drop table src is that the first statement, "use dw_op1",
left locks in ZooKeeper that were never released.
So when the second statement tries to acquire locks in ZooKeeper, it blocks.
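
One way to check this suspicion is to look at ZooKeeper directly after the
first statement has returned. This is only a sketch: the parent znode is
whatever hive.zookeeper.namespace is set to in hive-site.xml (the Hive default
is "hive_zookeeper_namespace"), and <zk-host> is your ZooKeeper server:

    zkCli.sh -server <zk-host>:2181 ls /hive_zookeeper_namespace
    zkCli.sh -server <zk-host>:2181 ls /hive_zookeeper_namespace/dw_op1

If the locks are really leaked, lock children (names like LOCK-SHARED-<seq>)
should still be listed there even though the "use dw_op1" statement finished
long ago.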

Test 2:
Restart the thriftServer. Instead of switching to another database, I just
drop tables in the default database:
(0) Restart the thriftServer and use beeline to connect to it.
(1) Drop a table: "drop table src;" (OK)
Amazing, it succeeds!
(2) Drop another table: "drop table src2;" (Blocked)
Same problem: the thriftServer is blocked in the "acquireReadWriteLocks" phase.
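
For completeness, the Test 2 session sketched the same way (default port
assumed again):

    $ beeline -u jdbc:hive2://localhost:10000
    0: jdbc:hive2://localhost:10000> drop table src;   <-- OK
    0: jdbc:hive2://localhost:10000> drop table src2;  <-- hangs in acquireReadWriteLocks

So it does not matter which statements are involved: whichever statement needs
locks second is the one that blocks.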

As you can see, only the first statement that requires locks succeeds.
So I think the root cause is that the thriftServer does not release its locks
in ZooKeeper correctly.
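
For now, the only workaround I can think of (just a sketch, and only if you
can live without Hive's concurrency locking) is to turn the lock manager off
so that no ZooKeeper locks are taken at all:

    set hive.support.concurrency=false;
    drop table src;
    drop table src2;

I am not sure whether the Spark 1.1 thriftServer honors this as a
session-level set, so it may need to go into hive-site.xml
(hive.support.concurrency = false) with a thriftServer restart instead.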

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org

