I believe you can adjust it by setting the following Spark property:

spark.akka.timeout    100s    Communication timeout between Spark nodes.
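For example, you could set it in conf/spark-defaults.conf or pass it at submit time. This is just a sketch — the 300s value and the app jar/class names are illustrative placeholders, not recommendations:

```
# conf/spark-defaults.conf
spark.akka.timeout    300s

# or at submit time (MyApp / my-app.jar are placeholders)
spark-submit --conf spark.akka.timeout=300s --class MyApp my-app.jar
```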

HTH.

-Todd



On Thu, Apr 21, 2016 at 9:49 AM, yuemeng (A) <yueme...@huawei.com> wrote:

> When I run a Spark application, I sometimes get the following ERROR:
>
> 16/04/21 09:26:45 ERROR SparkContext: Error initializing SparkContext.
> java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
>         at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>         at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>         at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>         at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>         at scala.concurrent.Await$.result(package.scala:107)
>         at akka.remote.Remoting.start(Remoting.scala:180)
>         at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
>         at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:618)
>         at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:615)
>         at akka.actor.ActorSystemImpl._start(ActorSystem.scala:615)
>         at akka.actor.ActorSystemImpl.start(ActorSystem.scala:632)
>         at akka.actor.ActorSystem$.apply(ActorSystem.scala:141)
>         at akka.actor.ActorSystem$.apply(ActorSystem.scala:118)
>         at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
>         at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
>         at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
>         at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1995)
>         at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>         at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1986)
>         at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
>         at org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:245)
>
> I traced the code, and I think raising akka.remote.startup-timeout might
> solve this problem, but I can't find any way to change it.
>
> Has anybody hit this problem, and does anyone know how to change the akka
> config in Spark?
>
> Thanks a lot
>
>
>
> 岳猛 (Yue Meng / Rick) 00277916
>
> Big Data Technology Development Dept.
> China Software Big Data 3ms team: http://3ms.huawei.com/hi/group/2031037