Thanks for the help.  I added the following dependency to my pom file and the
problem went away.
                <dependency> <!-- default Netty -->
                    <groupId>io.netty</groupId>
                    <artifactId>netty</artifactId>
                    <version>3.6.6.Final</version>
                </dependency>
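
In case it helps anyone else: "mvn dependency:tree" shows which netty
version Maven actually resolves, so you can confirm the override took
effect (filtering on the io.netty group just keeps the output short):

                mvn dependency:tree -Dincludes=io.netty
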
Ey-Chih
Date: Tue, 20 Jan 2015 16:57:20 -0800
Subject: Re: Spark 1.1.0 - spark-submit failed
From: yuzhih...@gmail.com
To: eyc...@hotmail.com
CC: user@spark.apache.org

Please check which netty jar(s) are on the classpath.
NioWorkerPool(Executor workerExecutor, int workerCount) was added in netty
3.5.4, so the NoSuchMethodError below means an older netty is being picked
up first.
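
If you want to double-check a particular jar, javap can list its
constructors (the path below is just an example location); with netty
3.5.4 or later you should see NioWorkerPool(java.util.concurrent.Executor, int):

    javap -classpath /path/to/netty-3.6.6.Final.jar org.jboss.netty.channel.socket.nio.NioWorkerPool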

Cheers
On Tue, Jan 20, 2015 at 4:15 PM, ey-chih chow <eyc...@hotmail.com> wrote:
Hi,

I issued the following command on an EC2 cluster launched using spark-ec2:

~/spark/bin/spark-submit --class com.crowdstar.cluster.etl.ParseAndClean \
  --master spark://ec2-54-185-107-113.us-west-2.compute.amazonaws.com:7077 \
  --deploy-mode cluster --total-executor-cores 4 \
  file:///tmp/etl-admin/jar/spark-etl-0.0.1-SNAPSHOT.jar \
  /ETL/input/2015/01/10/12/10Jan2015.avro \
  file:///tmp/etl-admin/vertica/VERTICA.avdl \
  file:///tmp/etl-admin/vertica/extras.json \
  file:///tmp/etl-admin/jar/spark-etl-0.0.1-SNAPSHOT.jar

The command failed with the following error log from the Spark UI.  Is there
any suggestion on how to fix the problem?  Thanks.

Ey-Chih Chow

======================================

Launch Command: "/usr/lib/jvm/java-1.7.0/bin/java" "-cp"
"/root/spark/work/driver-20150120200843-0000/spark-etl-0.0.1-SNAPSHOT.jar::::/root/ephemeral-hdfs/conf:/root/spark/conf:/root/spark/lib/spark-assembly-1.1.0-hadoop1.0.4.jar:/root/spark/lib/datanucleus-api-jdo-3.2.1.jar:/root/spark/lib/datanucleus-core-3.2.2.jar:/root/spark/lib/datanucleus-rdbms-3.2.1.jar"
"-XX:MaxPermSize=128m"
"-Dspark.executor.extraLibraryPath=/root/ephemeral-hdfs/lib/native/"
"-Dspark.executor.memory=13000m" "-Dspark.akka.askTimeout=10"
"-Dspark.cores.max=4"
"-Dspark.app.name=com.crowdstar.cluster.etl.ParseAndClean"
"-Dspark.jars=file:///tmp/etl-admin/jar/spark-etl-0.0.1-SNAPSHOT.jar"
"-Dspark.executor.extraClassPath=/root/ephemeral-hdfs/conf"
"-Dspark.master=spark://ec2-54-203-58-2.us-west-2.compute.amazonaws.com:7077"
"-Dakka.loglevel=WARNING" "-Xms512M" "-Xmx512M"
"org.apache.spark.deploy.worker.DriverWrapper"
"akka.tcp://sparkwor...@ip-10-33-140-157.us-west-2.compute.internal:47585/user/Worker"
"com.crowdstar.cluster.etl.ParseAndClean"
"/ETL/input/2015/01/10/12/10Jan2015.avro"
"file:///tmp/etl-admin/vertica/VERTICA.avdl"
"file:///tmp/etl-admin/vertica/extras.json"
"file:///tmp/etl-admin/jar/spark-etl-0.0.1-SNAPSHOT.jar"
========================================

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/root/spark/work/driver-20150120200843-0000/spark-etl-0.0.1-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/root/spark/lib/spark-assembly-1.1.0-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/01/20 20:08:45 INFO spark.SecurityManager: Changing view acls to: root,
15/01/20 20:08:45 INFO spark.SecurityManager: Changing modify acls to: root,
15/01/20 20:08:45 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root, ); users with modify permissions: Set(root, )
15/01/20 20:08:45 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/01/20 20:08:45 ERROR actor.ActorSystemImpl: Uncaught fatal error from thread [Driver-akka.actor.default-dispatcher-3] shutting down ActorSystem [Driver]
java.lang.NoSuchMethodError: org.jboss.netty.channel.socket.nio.NioWorkerPool.<init>(Ljava/util/concurrent/Executor;I)V
        at akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:282)
        at akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:239)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
        at scala.util.Try$.apply(Try.scala:161)
        at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
        at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
        at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
        at scala.util.Success.flatMap(Try.scala:200)
        at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
        at akka.remote.EndpointManager$$anonfun$8.apply(Remoting.scala:618)
        at akka.remote.EndpointManager$$anonfun$8.apply(Remoting.scala:610)
        at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
        at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
        at akka.remote.EndpointManager.akka$remote$EndpointManager$$listens(Remoting.scala:610)
        at akka.remote.EndpointManager$$anonfun$receive$2.applyOrElse(Remoting.scala:450)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
        at akka.actor.ActorCell.invoke(ActorCell.scala:456)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
        at akka.dispatch.Mailbox.run(Mailbox.scala:219)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Exception in thread "main" java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
        at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
        at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
        at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
        at scala.concurrent.Await$.result(package.scala:107)
        at akka.remote.Remoting.start(Remoting.scala:173)
        at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
        at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
        at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
        at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
        at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
        at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
        at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
        at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
        at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
        at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:33)
        at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)