Hi Andres,

If you're using the EC2 scripts to start your standalone cluster, you can
use "~/spark-ec2/copy-dir --delete ~/spark" to sync your jars across the
cluster. Note that you will need to restart the Master and the Workers
afterwards through "sbin/start-all.sh" and "sbin/stop-all.sh". If you're
not using the EC2 scripts, you will have to rsync the directory manually
(copy-dir just calls rsync internally).
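
Roughly, something like this (a sketch, assuming the default spark-ec2
layout with Spark installed under ~/spark on every node):

  # run on the master, after replacing the jars under ~/spark
  ~/spark-ec2/copy-dir --delete ~/spark   # rsync ~/spark to all the slaves
  ~/spark/sbin/stop-all.sh                # stop the Master and the Workers
  ~/spark/sbin/start-all.sh               # bring them back up with the new jars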

-Andrew


2014-08-06 2:39 GMT-07:00 Akhil Das <ak...@sigmoidanalytics.com>:

> Looks like a netty conflict there. Most likely you have multiple
> versions of the netty jars on the classpath (e.g.
> netty-3.6.6.Final.jar, netty-3.2.2.Final.jar, netty-all-4.0.13.Final.jar);
> you only need 3.6.6, I believe. A quick fix would be to remove the rest
> of them.
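>
> To see where the extra copies come from (a rough sketch; adjust the paths
> to your install, and the jar name is the one from your submit command):
>
>   # netty jars shipped with Spark / sitting on the cluster's classpath
>   find $SPARK_HOME -name '*netty*.jar'
>   # netty classes bundled inside your fat jar, if any
>   unzip -l redborder-spark-selfcontained.jar | grep -i netty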
>
> Thanks
> Best Regards
>
>
> On Wed, Aug 6, 2014 at 3:05 PM, Andres Gomez Ferrer <ago...@redborder.net>
> wrote:
>
>> Hi all,
>>
>> My name is Andres and I'm starting to use Apache Spark.
>>
>> I'm trying to submit my Spark jar to my cluster like this:
>>
>> spark-submit --class "net.redborder.spark.RedBorderApplication" --master
>> spark://pablo02:7077 redborder-spark-selfcontained.jar
>>
>> But when I did, my worker died, and so did my driver!
>>
>> This is my driver log:
>>
>> [INFO] 2014-08-06 06:30:12,025 [Driver-akka.actor.default-dispatcher-3]
>>  akka.event.slf4j.Slf4jLogger applyOrElse - Slf4jLogger started
>> [INFO] 2014-08-06 06:30:12,061 [Driver-akka.actor.default-dispatcher-3]
>>  Remoting apply$mcV$sp - Starting remoting
>> [ERROR] 2014-08-06 06:30:12,089 [Driver-akka.actor.default-dispatcher-6]
>>  akka.actor.ActorSystemImpl apply$mcV$sp - Uncaught fatal error from thread
>> [Driver-akka.actor.default-dispatcher-3] shutting down ActorSystem [Driver]
>> java.lang.VerifyError: (class:
>> org/jboss/netty/channel/socket/nio/NioWorkerPool, method: createWorker
>> signature:
>> (Ljava/util/concurrent/Executor;)Lorg/jboss/netty/channel/socket/nio/AbstractNioWorker;)
>> Wrong return type in function
>>  at
>> akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:282)
>> at
>> akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:239)
>>  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
>>  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown
>> Source)
>> at java.lang.reflect.Constructor.newInstance(Unknown Source)
>>  at
>> akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
>> at scala.util.Try$.apply(Try.scala:161)
>>  at
>> akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
>> at
>> akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
>>  at
>> akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
>> at scala.util.Success.flatMap(Try.scala:200)
>>  at
>> akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
>> at akka.remote.EndpointManager$$anonfun$8.apply(Remoting.scala:618)
>>  at akka.remote.EndpointManager$$anonfun$8.apply(Remoting.scala:610)
>> at
>> scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
>>  at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>> at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>>  at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>> at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>>  at
>> scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
>> at
>> akka.remote.EndpointManager.akka$remote$EndpointManager$$listens(Remoting.scala:610)
>>  at
>> akka.remote.EndpointManager$$anonfun$receive$2.applyOrElse(Remoting.scala:450)
>> at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
>>  at akka.actor.ActorCell.invoke(ActorCell.scala:456)
>> at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
>>  at akka.dispatch.Mailbox.run(Mailbox.scala:219)
>> at
>> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
>>  at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>> at
>> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>>  at
>> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>> at
>> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>> [INFO] 2014-08-06 06:30:12,093 [Driver-akka.actor.default-dispatcher-5]
>>  akka.remote.RemoteActorRefProvider$RemotingTerminator apply$mcV$sp -
>> Shutting down remote daemon.
>> [INFO] 2014-08-06 06:30:12,095 [Driver-akka.actor.default-dispatcher-5]
>>  akka.remote.RemoteActorRefProvider$RemotingTerminator apply$mcV$sp -
>> Remote daemon shut down; proceeding with flushing remote transports.
>> [INFO] 2014-08-06 06:30:12,102 [Driver-akka.actor.default-dispatcher-3]
>>  Remoting apply$mcV$sp - Remoting shut down
>> [INFO] 2014-08-06 06:30:12,104 [Driver-akka.actor.default-dispatcher-3]
>>  akka.remote.RemoteActorRefProvider$RemotingTerminator apply$mcV$sp -
>> Remoting shut down.
>> [ERROR] [08/06/2014 06:30:22.065] [main] [Remoting] Remoting error:
>> [Startup timed out] [
>> akka.remote.RemoteTransportException: Startup timed out
>>  at
>> akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)
>> at akka.remote.Remoting.start(Remoting.scala:191)
>>  at
>> akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
>> at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
>>  at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
>> at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
>>  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
>> at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
>>  at
>> org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:104)
>> at
>> org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:33)
>>  at
>> org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
>> Caused by: java.util.concurrent.TimeoutException: Futures timed out after
>> [10000 milliseconds]
>>  at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>> at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>>  at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>> at
>> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>>  at scala.concurrent.Await$.result(package.scala:107)
>> at akka.remote.Remoting.start(Remoting.scala:173)
>>  ... 9 more
>> ]
>>
>> and I can see this in my worker log:
>>
>> [ERROR] 2014-08-06 06:30:22,523
>> [sparkWorker-akka.actor.default-dispatcher-2]
>>  org.apache.spark.deploy.worker.Worker logError - Worker registration
>> failed: Attempted to re-register worker at same address:
>> akka.tcp://sparkWorker@pablo02:54370
>>
>> Can someone help me? I want to run my application on my cluster.
>>
>> Regards!!
>>
>> --
>> Andres Gomez
>> Developer at Eneo Tecnologia.
>> C/ Manufactura 2, Edificio Euro, Oficina 3N
>> Mairena del Aljarafe - 41927 - Sevilla
>> Telf.- 955 60 11 60 / 619 04 55 18
>>
>
>
