It means you did not exclude the Servlet APIs from some dependency in
your app, and one of your dependencies is pulling them in every time.
Look at the dependency tree and exclude whatever brings in javax.servlet.
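
For example, if the tree (the sbt-dependency-graph plugin's
dependency-tree task is one way to print it) shows hadoop-client pulling
in the servlet API, which is a common culprit though you should confirm
against your own output, a per-module exclusion in build.sbt looks
roughly like:

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.6.0" exclude("javax.servlet", "servlet-api")

Substitute whatever module your tree actually implicates.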

The Servlet API should already be available from Spark itself, and the
particular javax.servlet JAR from Oracle has signing info that you would
have to strip out, or simply exclude the whole thing.
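
If you'd rather not chase down the exact artifact, sbt's ExclusionRule
lets you drop everything under the javax.servlet organization from a
given dependency; a sketch, again assuming hadoop-client is the offender:

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.6.0" excludeAll ExclusionRule(organization = "javax.servlet")

Don't apply that to the Spark modules themselves, since Spark is what
should be providing the servlet classes.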

On Wed, Feb 4, 2015 at 1:20 AM, DEVAN M.S. <msdeva...@gmail.com> wrote:
> HI all,
> I need a help.
>
> When I try to run my Spark project, it fails with: "Exception in
> thread "main" java.lang.SecurityException: class
> "javax.servlet.ServletRegistration"'s signer information does not match
> signer information of other classes in the same package".
> After deleting the folder "/home/devan/.ivy2/cache/javax.servlet",
> things work...
>
> I don't know what happened.
> Please help, because on each restart the same folder reappears.
> Found this:
> http://stackoverflow.com/questions/2877262/java-securityexception-signer-information-does-not-match
>
> Which one causes the conflict? These are the libraries I am using:
>
> libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"
>
> libraryDependencies += "org.apache.spark" % "spark-sql_2.10" % "1.1.0"
>
> libraryDependencies += "com.googlecode.json-simple" % "json-simple" %
> "1.1.1"
>
> libraryDependencies += "org.apache.spark" % "spark-hive_2.10" % "1.1.0"
>
> libraryDependencies += "org.apache.spark" % "spark-graphx_2.10" % "1.1.0"
>
> libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "1.1.0"
>
> libraryDependencies +="com.google.code.gson" % "gson" % "2.3.1"
>
> libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.6.0"
>
> libraryDependencies += "org.apache.spark" % "spark-streaming-mqtt_2.10" %
> "1.1.0"
>
> libraryDependencies += "org.tachyonproject" % "tachyon" % "0.5.0"
