Hey Du. This is probably a question for the spark/shark mailing list.
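
That said, the spark.deploy.client.Client / SparkDeploySchedulerBackend frames in your trace suggest Shark was still pointed at a standalone spark:// master rather than at Mesos, which would explain why it only works once the Spark daemons are up. As a sketch (the hostname, port, and paths below are placeholders, not taken from your setup), a Mesos-backed shark-env.sh would look something like:

    # sketch of shark-env.sh; every host/path here is a placeholder
    export MASTER=mesos://mesos-master.example.com:5050   # Mesos master URL, not spark://
    export MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.so
    export SPARK_HOME=/path/to/spark-0.7.2
    export SCALA_HOME=/path/to/scala-2.9.3

With MASTER set to a mesos:// URL, Spark (and hence Shark) should schedule through Mesos without the standalone master/slave daemons running.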

On Thu, Jun 20, 2013 at 12:30 PM, Du Li <[email protected]> wrote:

> Hi,
>
> I set up Mesos (0.11.0) on my local cluster running Ubuntu 12.04. Now
> without starting Spark (0.7.2) daemons, I was able to run Spark jobs
> directly on Mesos. However, to run Shark (0.7.0), I still had to start the
> Spark master/slave daemons. Otherwise, shark exits with an
> IllegalStateException. I have modified shark/conf/shark-env.sh to provide
> the Mesos library path. This seems reasonable since Shark eventually
> decomposes queries into Spark tasks. But just to confirm, do I really need
> to start Spark daemons to run Shark?
>
> Thanks,
> Du
>
> --------------------
> $shark
> …
> shark> show tables;
> show tables;
> FAILED: Hive Internal Error: java.lang.IllegalStateException(Shutdown in
> progress)
> shark> Exception in thread "Thread-1" java.util.concurrent.TimeoutException: Futures timed out after [5000] milliseconds
> at akka.dispatch.DefaultPromise.ready(Future.scala:870)
> at akka.dispatch.DefaultPromise.result(Future.scala:874)
> at akka.dispatch.Await$.result(Future.scala:74)
> at spark.deploy.client.Client.stop(Client.scala:117)
> at spark.scheduler.cluster.SparkDeploySchedulerBackend.stop(SparkDeploySchedulerBackend.scala:43)
> at spark.scheduler.cluster.ClusterScheduler.stop(ClusterScheduler.scala:254)
> at spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:724)
> at spark.SparkContext.stop(SparkContext.scala:532)
> at shark.SharkEnv$.stop(SharkEnv.scala:115)
> at shark.SharkCliDriver$$anon$1.run(SharkCliDriver.scala:109)
>
