Re: SparkContext startup time out

2014-07-26 Thread Anand Avati
I am bumping into this problem as well. I am trying to move from akka 2.2.x
to 2.3.x in order to port to Scala 2.11 - only akka 2.3.x is available for
Scala 2.11. Every akka 2.2.x version works fine, and every akka 2.3.x version
throws the following exception in new SparkContext. Still investigating why.

  java.util.concurrent.TimeoutException: Futures timed out after [1 milliseconds]
  at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
  at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
  at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
  at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
  at scala.concurrent.Await$.result(package.scala:107)
  at akka.remote.Remoting.start(Remoting.scala:180)
  at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
  at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:618)
  at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:615)
  at akka.actor.ActorSystemImpl._start(ActorSystem.scala:615)
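For context: the future that times out here is bounded by Akka's remoting
startup timeout, which is configurable. Raising it only helps if the actor
system is genuinely slow to start rather than failing outright, so this is
not a fix for the underlying Akka mismatch. A sketch, assuming Akka's
standard reference.conf keys:

```
# application.conf sketch (assumption: standard Akka remoting settings).
# Raises the remoting startup timeout from its 10 s default.
akka.remote.startup-timeout = 30 s
```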




On Fri, May 30, 2014 at 6:33 AM, Pierre B 
pierre.borckm...@realimpactanalytics.com wrote:

 I was annoyed by this as well.
 It appears that just permuting the order of dependency inclusion solves the
 problem:

 first Spark, then your CDH Hadoop distro.

 HTH,

 Pierre






Re: SparkContext startup time out

2014-05-30 Thread Pierre B
I was annoyed by this as well.
It appears that just permuting the order of dependency inclusion solves the
problem:

first Spark, then your CDH Hadoop distro.

HTH,

Pierre
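In sbt terms, the suggestion above amounts to something like the following
(a sketch only; the versions and the CDH artifact coordinates here are
illustrative, not taken from this thread):

```scala
// build.sbt sketch: list Spark before the CDH Hadoop client so that Spark's
// transitive dependencies (Akka, Netty) take precedence on the classpath.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "0.9.1",
  "org.apache.hadoop" % "hadoop-client" % "2.0.0-cdh4.6.0"
)
```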



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/SparkContext-startup-time-out-tp1753p6582.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.


Re: SparkContext startup time out

2014-05-16 Thread Sophia
How did you deal with this problem? I have been running into it these days as
well. God bless me.

Best regards,



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/SparkContext-startup-time-out-tp1753p5738.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.


Re: SparkContext startup time out

2014-05-16 Thread Sophia
How did you deal with this problem in the end? I also ran into it.
Best regards,



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/SparkContext-startup-time-out-tp1753p5739.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.


Re: SparkContext startup time out

2014-03-13 Thread velvia
By the way, this is the underlying error for me:

java.lang.VerifyError: (class: org/jboss/netty/channel/socket/nio/NioWorkerPool,
method: createWorker, signature:
(Ljava/util/concurrent/Executor;)Lorg/jboss/netty/channel/socket/nio/AbstractNioWorker;)
Wrong return type in function
  at akka.remote.transport.netty.NettyTransport.init(NettyTransport.scala:282)
  at akka.remote.transport.netty.NettyTransport.init(NettyTransport.scala:239)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)

I don't quite understand, though: I ran the dependency-graph plugin and Netty
appears nowhere in my dependency chain.
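One way to see where a conflicting class actually comes from at runtime, when
the build-time dependency graph does not show it, is to ask the JVM which jar
loaded it. The object and method names below are mine, not from this thread;
it is a diagnostic sketch, not a fix:

```scala
// Sketch: report which jar (or class directory) a class was loaded from, to
// spot duplicate Netty artifacts on the runtime classpath. getCodeSource
// returns null for JDK bootstrap classes, hence the Option wrapper.
object WhichJar {
  def whichJar(c: Class[_]): Option[String] =
    Option(c.getProtectionDomain.getCodeSource).map(_.getLocation.toString)

  def main(args: Array[String]): Unit =
    // For the VerifyError above, one would ask about the Netty class, e.g.:
    //   whichJar(Class.forName("org.jboss.netty.channel.socket.nio.NioWorkerPool"))
    println(whichJar(classOf[scala.Option[_]]))
}
```

If the answer points at an unexpected jar (Hadoop distributions of that era
bundled an older org.jboss.netty:netty artifact), excluding that artifact in
the build is a common workaround.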



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/SparkContext-startup-time-out-tp1753p2669.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.