I have also tried to build the assembly from source, but in that case I get
the following error (from both spark-shell and Zeppelin):

java.lang.SecurityException: Invalid signature file digest for Manifest main attributes
    at sun.security.util.SignatureFileVerifier.processImpl(SignatureFileVerifier.java:284)
    at sun.security.util.SignatureFileVerifier.process(SignatureFileVerifier.java:238)
    at java.util.jar.JarVerifier.processEntry(JarVerifier.java:273)
    at java.util.jar.JarVerifier.update(JarVerifier.java:228)
    at java.util.jar.JarFile.initializeVerifier(JarFile.java:383)
    at java.util.jar.JarFile.getInputStream(JarFile.java:450)
    at sun.misc.JarIndex.getJarIndex(JarIndex.java:137)
    at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:839)
    at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:831)
    at java.security.AccessController.doPrivileged(Native Method)
    at sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:830)
    at sun.misc.URLClassPath$JarLoader.<init>(URLClassPath.java:803)
    at sun.misc.URLClassPath$3.run(URLClassPath.java:530)
    at sun.misc.URLClassPath$3.run(URLClassPath.java:520)
    at java.security.AccessController.doPrivileged(Native Method)
    at sun.misc.URLClassPath.getLoader(URLClassPath.java:519)
    at sun.misc.URLClassPath.getLoader(URLClassPath.java:492)
    at sun.misc.URLClassPath.getNextLoader(URLClassPath.java:457)
    at sun.misc.URLClassPath.access$100(URLClassPath.java:64)
    at sun.misc.URLClassPath$1.next(URLClassPath.java:239)
    at sun.misc.URLClassPath$1.hasMoreElements(URLClassPath.java:250)
    at java.net.URLClassLoader$3$1.run(URLClassLoader.java:601)
    at java.net.URLClassLoader$3$1.run(URLClassLoader.java:599)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader$3.next(URLClassLoader.java:598)
    at java.net.URLClassLoader$3.hasMoreElements(URLClassLoader.java:623)
    at sun.misc.CompoundEnumeration.next(CompoundEnumeration.java:45)
    at sun.misc.CompoundEnumeration.hasMoreElements(CompoundEnumeration.java:54)
    at sun.misc.CompoundEnumeration.next(CompoundEnumeration.java:45)
    at sun.misc.CompoundEnumeration.hasMoreElements(CompoundEnumeration.java:54)
    at com.typesafe.config.impl.Parseable$ParseableResources.rawParseValue(Parseable.java:582)
    at com.typesafe.config.impl.Parseable$ParseableResources.rawParseValue(Parseable.java:554)
    at com.typesafe.config.impl.Parseable.parseValue(Parseable.java:176)
    at com.typesafe.config.impl.Parseable.parseValue(Parseable.java:170)
    at com.typesafe.config.impl.Parseable.parse(Parseable.java:227)
    at com.typesafe.config.impl.ConfigImpl$1.call(ConfigImpl.java:368)
    at com.typesafe.config.impl.ConfigImpl$1.call(ConfigImpl.java:365)
    at com.typesafe.config.impl.ConfigImpl$LoaderCache.getOrElseUpdate(ConfigImpl.java:58)
    at com.typesafe.config.impl.ConfigImpl.computeCachedConfig(ConfigImpl.java:86)
    at com.typesafe.config.impl.ConfigImpl.defaultReference(ConfigImpl.java:365)
    at com.typesafe.config.ConfigFactory.defaultReference(ConfigFactory.java:423)
    at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:160)
    at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:505)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:119)
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:52)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1964)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1955)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:55)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:266)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:288)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:457)
    at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:342)
    at org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:144)
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:484)
    at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:74)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:68)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:92)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:312)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:171)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

2016-02-17 23:11 GMT+01:00 vincent gromakowski <
vincent.gromakow...@gmail.com>:

> I have found that the error comes from the spark-cassandra assembly jar
> that I built from a snapshot. I am going crazy with this connector...
>
> 2016-02-17 22:44 GMT+01:00 Felix Cheung <felixcheun...@hotmail.com>:
>
>> It looks like the jar file is corrupted somehow?
>>
>> So Spark 1.6 works when you run the pre-built official 0.5.6 release?
>>
>>
>> On Wed, Feb 17, 2016 at 10:53 AM -0800, "vincent gromakowski" <
>> vincent.gromakow...@gmail.com> wrote:
>>
>> Hi all,
>> I have built Zeppelin from source (master branch), but I cannot get it to
>> work with Spark. Here is my build command:
>>
>> mvn clean package -DskipTests -Pspark-1.6 -Phadoop-2.6
>>
>> And I get the following error with Spark 1.6 (the pre-built 1.6
>> distribution with Hadoop, from the Spark site):
>>
>> java.lang.SecurityException: Invalid signature file digest for Manifest
>> main attributes
>> [stack trace snipped; identical to the trace at the top of this message]
>>
>> I have also tried the pre-built 0.5.6 binary, but the Cassandra
>> interpreter doesn't work...
>>
>
>
