On the surface it sounds again like https://issues.apache.org/jira/browse/SPARK-1949, but I'm not sure it's exactly the same problem.
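As an aside, the "javax.servlet.FilterRegistration" signer error quoted below is usually a different problem from Netty itself: it tends to mean two jars provide the javax.servlet classes and one of them is signed (the Jetty orbit javax.servlet artifact is a common culprit). Checking for duplicate servlet jars may help; the second pattern below is just a guess at catching a copy published under another group ID:

  mvn dependency:tree -Dincludes=javax.servlet,*:javax.servlet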
The build exclusions are intended to remove the dependency on the old Netty artifact. I think the Maven build has it sorted out, though. Are you using Maven? If so, what do you see in mvn dependency:tree -- where does it turn up? (A couple of examples follow the quoted message below.)

On Mon, Jun 9, 2014 at 9:34 AM, toivoa <toivo....@gmail.com> wrote:
> Using
>
>   <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-core_2.10</artifactId>
>     <version>1.0.0</version>
>   </dependency>
>
> I can create a simple test and run it under Eclipse.
> But when I try to deploy on the test server I run into dependency problems.
>
> 1. Spark requires
>
>   <artifactId>akka-remote_2.10</artifactId>
>   <version>2.2.3-shaded-protobuf</version>
>
> and this in turn requires
>
>   <dependency>
>     <groupId>io.netty</groupId>
>     <artifactId>netty</artifactId>
>     <version>3.6.6.Final</version>
>   </dependency>
>
> 2. At the same time Spark itself requires
>
>   <artifactId>netty-parent</artifactId>
>   <version>4.0.17.Final</version>
>
> So now I have two different Netty versions, and I get either
>
>   Exception in thread "main" java.lang.SecurityException: class
>   "javax.servlet.FilterRegistration"'s signer information does not match
>   signer information of other classes in the same package
>
> when using 3.6.6.Final, or
>
>   14/06/09 16:08:10 ERROR ActorSystemImpl: Uncaught fatal error from thread
>   [spark-akka.actor.default-dispatcher-4] shutting down ActorSystem [spark]
>   java.lang.NoClassDefFoundError: org/jboss/netty/util/Timer
>
> when using 4.0.17.Final.
>
> What am I doing wrong, and how do I solve this?
>
> Thanks
> toivo
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-1-0-0-Maven-dependencies-problems-tp7247.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
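To make the tree easier to read, the dependency:tree output can be limited to just the Netty artifacts. Both group IDs below have hosted Netty over the years (org.jboss.netty for old 3.x releases, io.netty for later 3.x and all of 4.x), so filtering on both should catch every copy:

  mvn dependency:tree -Dincludes=io.netty,org.jboss.netty

The pruned output keeps the chain of dependencies through which each match arrives, which is exactly the "where does it turn up" question.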
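And a minimal sketch of the exclusion approach, with coordinates that are only a guess -- which artifact to exclude, and from which dependency, has to come from your actual tree output. One caveat: Netty 3.x and 4.x live in different Java packages (org.jboss.netty vs io.netty), so they can coexist on one classpath, and the NoClassDefFoundError above is what happens when the 3.x jar that Akka needs goes missing entirely. An exclusion only makes sense against a genuinely redundant copy:

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.0.0</version>
    <exclusions>
      <!-- hypothetical coordinates: exclude a duplicate Netty copy only
           if dependency:tree shows it arriving through more than one path -->
      <exclusion>
        <groupId>io.netty</groupId>
        <artifactId>netty</artifactId>
      </exclusion>
    </exclusions>
  </dependency>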