I followed "Hive on Spark: Getting Started"
<https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started>
to build and configure Spark (1.5.1) and Hive (1.2.1).
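
For context, the Hive-side settings I applied are essentially the ones from
that page. A minimal sketch of what I set in the Hive CLI (the master URL,
event log directory, and memory value below are illustrative placeholders
for my standalone cluster, not necessarily my exact values):

    set hive.execution.engine=spark;
    set spark.master=spark://192.168.181.200:7077;  -- illustrative master URL
    set spark.eventLog.enabled=true;
    set spark.eventLog.dir=/tmp/spark-events;       -- illustrative; directory must exist
    set spark.executor.memory=512m;                 -- illustrative value
    set spark.serializer=org.apache.spark.serializer.KryoSerializer;

Setting hive.execution.engine=spark is what routes the query through the
Spark client whose log output appears below.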
  
With that configuration in place, I executed a query in the Hive CLI and
got the following error in hive.log:

15/11/13 09:01:37 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:37 INFO storage.MemoryStore: Block broadcast_0_piece0 stored as bytes
in memory (estimated size 40.6 KB, free 529.8 MB)
15/11/13 09:01:37 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:37 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory
on 192.168.181.200:52797 (size: 40.6 KB, free: 530.2 MB)
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:37 ERROR util.Utils: uncaught error in thread SparkListenerBus,
stopping SparkContext
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl:
java.lang.AbstractMethodError
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.scheduler.SparkListenerBus$class.onPostEvent(SparkListenerBus.scala:62)
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:56)
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.util.AsynchronousListenerBus.postToAll(AsynchronousListenerBus.scala:37)
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(AsynchronousListenerBus.scala:79)
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1136)
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.util.AsynchronousListenerBus$$anon$1.run(AsynchronousListenerBus.scala:63)
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO spark.SparkContext: Created broadcast 0 from hadoopRDD at
SparkPlanGenerator.java:188
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/metrics/json,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/api,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/static,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/executors/threadDump,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/executors/json,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/executors,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/environment/json,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/environment,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/storage/rdd,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/storage/json,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/storage,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/pool/json,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/pool,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/stage/json,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/stage,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/json,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/jobs/job/json,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/jobs/job,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/jobs/json,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/jobs,null}
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO ui.SparkUI: Stopped Spark web UI at
http://192.168.181.200:4040
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO scheduler.DAGScheduler: Stopping DAGScheduler
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO cluster.SparkDeploySchedulerBackend: Shutting down all
executors
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO cluster.SparkDeploySchedulerBackend: Asking each executor to
shut down
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO log.PerfLogger: </PERFLOG method=SparkCreateTran.Map 1
start=1447376493417 end=1447376498391 duration=4974
from=org.apache.hadoop.hive.ql.exec.spark.SparkPlanGenerator>
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO log.PerfLogger: <PERFLOG method=SparkCreateTran.Reducer 2
from=org.apache.hadoop.hive.ql.exec.spark.SparkPlanGenerator>
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO log.PerfLogger: <PERFLOG method=serializePlan
from=org.apache.hadoop.hive.ql.exec.Utilities>
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO exec.Utilities: Serializing ReduceWork via kryo
state = STARTED
15/11/13 09:01:38 [main]: INFO status.SparkJobMonitor: state = STARTED
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 WARN remote.ReliableDeliverySupervisor: Association with remote
system [akka.tcp://sparkExecutor@192.168.181.200:33004] has failed, address
is now gated for [5000] ms. Reason: [Disassociated] 
15/11/13 09:01:38 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:38 INFO log.PerfLogger: </PERFLOG method=serializePlan
start=1447376498435 end=1447376498825 duration=390
from=org.apache.hadoop.hive.ql.exec.Utilities>
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:39 INFO log.PerfLogger: </PERFLOG method=SparkCreateTran.Reducer 2
start=1447376498391 end=1447376499054 duration=663
from=org.apache.hadoop.hive.ql.exec.spark.SparkPlanGenerator>
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:39 INFO log.PerfLogger: </PERFLOG method=SparkBuildPlan
start=1447376493415 end=1447376499054 duration=5639
from=org.apache.hadoop.hive.ql.exec.spark.SparkPlanGenerator>
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:39 INFO log.PerfLogger: <PERFLOG method=SparkBuildRDDGraph
from=org.apache.hadoop.hive.ql.exec.spark.SparkPlan>
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:39 INFO log.PerfLogger: </PERFLOG method=SparkBuildRDDGraph
start=1447376499055 end=1447376499205 duration=150
from=org.apache.hadoop.hive.ql.exec.spark.SparkPlan>
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl: 15/11/13
09:01:39 INFO client.RemoteDriver: Failed to run job
a73c5dac-32c1-4df3-9f83-c715acf599bc
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:
java.lang.IllegalStateException: Cannot call methods on a stopped
SparkContext
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:104)
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.SparkContext.submitJob(SparkContext.scala:1979)
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.rdd.AsyncRDDActions$$anonfun$foreachAsync$1.apply(AsyncRDDActions.scala:118)
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.rdd.AsyncRDDActions$$anonfun$foreachAsync$1.apply(AsyncRDDActions.scala:116)
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.rdd.RDD.withScope(RDD.scala:306)
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.rdd.AsyncRDDActions.foreachAsync(AsyncRDDActions.scala:116)
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.api.java.JavaRDDLike$class.foreachAsync(JavaRDDLike.scala:690)
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.spark.api.java.AbstractJavaRDDLike.foreachAsync(JavaRDDLike.scala:47)
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient$JobStatusJob.call(RemoteHiveSparkClient.java:257)
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:366)
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:        at
org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:335)
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:        at
java.util.concurrent.FutureTask.run(FutureTask.java:262)
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
15/11/13 09:01:39 [stderr-redir-1]: INFO client.SparkClientImpl:        at
java.lang.Thread.run(Thread.java:745)
15/11/13 09:01:39 [RPC-Handler-3]: DEBUG rpc.KryoMessageCodec: Decoded
message of type org.apache.hive.spark.client.rpc.Rpc$MessageHeader (5 bytes)
15/11/13 09:01:39 [RPC-Handler-3]: DEBUG rpc.KryoMessageCodec: Decoded
message of type org.apache.hive.spark.client.BaseProtocol$JobResult (1570
bytes)
15/11/13 09:01:39 [RPC-Handler-3]: DEBUG rpc.RpcDispatcher: [ClientProtocol]
Received RPC message: type=CALL id=1
payload=org.apache.hive.spark.client.BaseProtocol$JobResult
15/11/13 09:01:39 [RPC-Handler-3]: INFO client.SparkClientImpl: Received
result for a73c5dac-32c1-4df3-9f83-c715acf599bc
15/11/13 09:01:39 [RPC-Handler-3]: DEBUG rpc.KryoMessageCodec: Encoded
message of type org.apache.hive.spark.client.rpc.Rpc$MessageHeader (5 bytes)
15/11/13 09:01:39 [RPC-Handler-3]: DEBUG rpc.KryoMessageCodec: Encoded
message of type org.apache.hive.spark.client.rpc.Rpc$NullMessage (2 bytes)
state = FAILED
15/11/13 09:01:39 [main]: INFO status.SparkJobMonitor: state = FAILED
Status: Failed
15/11/13 09:01:39 [main]: ERROR status.SparkJobMonitor: Status: Failed
15/11/13 09:01:39 [main]: INFO log.PerfLogger: </PERFLOG method=SparkRunJob
start=1447376487559 end=1447376499572 duration=12013
from=org.apache.hadoop.hive.ql.exec.spark.status.SparkJobMonitor>

The first failure appears to be the java.lang.AbstractMethodError thrown on
the SparkListenerBus thread, which stops the SparkContext; the later
"Cannot call methods on a stopped SparkContext" error looks like fallout
from that. How can I solve this problem?

Thanks in advance.



