This usually means "something didn't start due to a fairly low-level
error", like a class not found or incompatible Spark versions
somewhere. At least, that's what I see in unit tests when things
like that go wrong.
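
If it's the Akka remoting startup that's timing out (which is what the
Remoting.start frame in the trace suggests), one thing you could try, as a
guess rather than a known fix, is raising the Akka timeouts before creating
the context. These are real Spark 1.x properties, but whether they cover the
actor-system startup path in your version is an assumption on my part:

```
# In spark-defaults.conf, or via conf.set(...) on the SparkConf you pass
# to JavaSparkContext. Values are in seconds.
spark.akka.timeout      300
spark.akka.askTimeout   300
```

If that doesn't help, it might be a port-binding race in local mode; the
startServiceOnPort frame shows Spark retrying ports, and the number of
retries is controlled by spark.port.maxRetries (default 16), which you could
bump as well.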

On Tue, Apr 14, 2015 at 8:06 AM, Akhil Das <ak...@sigmoidanalytics.com> wrote:
> Can you share a bit more information about the type of application you are
> running? From the stack trace I can only say that, for some reason, your
> connection timed out (probably a GC pause or a network issue).
>
> Thanks
> Best Regards
>
> On Wed, Apr 8, 2015 at 9:48 PM, Shuai Zheng <szheng.c...@gmail.com> wrote:
>>
>> Hi All,
>>
>>
>>
>> In some cases, I get the exception below when I run Spark in local mode (I
>> haven't seen this on a cluster). It is odd, and it also affects my local
>> unit tests (it doesn't happen on every run, but usually about once per 4-5
>> runs). From the stack trace it looks like the error happens while creating
>> the context, but I don't know why, or what parameters I could set to work
>> around this issue.
>>
>>
>>
>> Exception in thread "main" java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
>>         at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>>         at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>>         at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>>         at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>>         at scala.concurrent.Await$.result(package.scala:107)
>>         at akka.remote.Remoting.start(Remoting.scala:180)
>>         at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
>>         at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:618)
>>         at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:615)
>>         at akka.actor.ActorSystemImpl._start(ActorSystem.scala:615)
>>         at akka.actor.ActorSystemImpl.start(ActorSystem.scala:632)
>>         at akka.actor.ActorSystem$.apply(ActorSystem.scala:141)
>>         at akka.actor.ActorSystem$.apply(ActorSystem.scala:118)
>>         at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
>>         at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:55)
>>         at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
>>         at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1832)
>>         at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>>         at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1823)
>>         at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:57)
>>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:223)
>>         at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
>>         at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
>>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:270)
>>         at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
>>         at com.***.executor.FinancialEngineExecutor.run(FinancialEngineExecutor.java:110)
>>
>>
>>
>> Regards,
>>
>>
>>
>> Shuai
>>
>>
>
>
