It might mean one of your JARs is corrupted. Try running sbt clean and then sbt
assembly again.
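Since "error in opening zip file" usually points at a damaged archive, and a JAR is just a zip, you can also test each JAR directly before rebuilding. This is only a sketch: the demo directory and the good.jar/bad.jar files below are made up for illustration; in practice you would point the loop at wherever your Spark/Shark assembly JARs actually live on each node.

```shell
# Demo setup (illustrative only -- replace with your real lib directory):
JAR_DIR=$(mktemp -d)
python3 -c "import zipfile,sys; zipfile.ZipFile(sys.argv[1]+'/good.jar','w').writestr('A.class','x')" "$JAR_DIR"
printf 'not a zip' > "$JAR_DIR/bad.jar"   # simulate a corrupted JAR

corrupted=""
for jar in "$JAR_DIR"/*.jar; do
  # "python3 -m zipfile -t" exits non-zero for a damaged archive;
  # "unzip -tq" works the same way if you prefer it.
  if ! python3 -m zipfile -t "$jar" > /dev/null 2>&1; then
    corrupted="$corrupted $jar"
    echo "corrupted: $jar"
  fi
done
```

Any JAR the loop flags is worth deleting before you rerun the assembly, so a stale corrupted copy doesn't survive the rebuild.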

Matei

On Nov 12, 2013, at 10:48 AM, Josh Rosen <[email protected]> wrote:

> I've seen this "error: error while loading <root>, error in opening zip file" 
> before, but I'm not exactly sure what causes it.  Here's a JIRA discussing 
> that error in earlier versions of Spark: 
> https://spark-project.atlassian.net/browse/SPARK-692
> 
> 
> On Tue, Nov 12, 2013 at 10:44 AM, Andre Schumacher 
> <[email protected]> wrote:
> 
> Hi,
> 
> Have you tried it in local mode?
> 
> The error message seems to indicate problems with the classpath. So you
> may want to make sure that all the binaries are present on all nodes at
> the same location (in addition to making sure the {spark,shark}-env.sh
> settings are correct on all nodes).
> 
> Andre
> 
> On 11/12/2013 02:57 AM, Kapil Malik wrote:
> > Hi all,
> >
> > I have a standalone Spark + Shark cluster, set up following the steps at
> > https://github.com/amplab/shark/wiki/Running-Shark-on-a-Cluster
> > $SPARK_HOME/spark-shell works fine. I am able to access files and perform
> > operations.
> > $SHARK_HOME/bin/shark also works fine; I am able to access hive tables and
> > perform operations.
> >
> > However,
> > $SHARK_HOME/bin/shark-shell does not work :(
> >
> > Logs -
> > Using Scala version 2.9.3 (OpenJDK 64-Bit Server VM, Java 1.6.0_27)
> > Initializing interpreter...
> > error: error while loading <root>, error in opening zip file
> > Failed to initialize compiler: object scala not found.
> >
> > I've set SCALA_HOME correctly in shark-env.sh and also in spark-env.sh.
> >
> > This failure prevents me from using sql2rdd operations. Can you please
> > suggest any steps to troubleshoot this, OR to make sql2rdd work with
> > spark-shell (if not shark-shell)?
> >
> > Any suggestions?
> >
> > Thanks and regards,
> >
> > Kapil Malik | [email protected]
> >
> >
> >
> 
> 
