You can generate the dependency tree using:

mvn dependency:tree

and grep the output for 'org.scala-lang' to see which artifacts pull in which Scala version.
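For illustration, here is what that grep typically surfaces. The tree below is a mocked-up excerpt (the coordinates are taken from the pom in your mail, but the tree shape is invented, not real `mvn dependency:tree` output); the real command is `mvn dependency:tree | grep 'org.scala-lang'`:

```shell
# Pipe a mock dependency:tree excerpt through grep to show what to look for;
# with a real build you would pipe `mvn dependency:tree` instead of the here-doc.
matches=$(cat <<'EOF' | grep 'org.scala-lang'
[INFO] com.mycompany:test:jar:2.0.5-SNAPSHOT
[INFO] +- org.scala-lang:scala-library:jar:2.10.4:compile
[INFO] +- org.apache.spark:spark-core_2.10:jar:1.4.1:compile
[INFO] |  \- org.scala-lang:scala-reflect:jar:2.10.4:compile
[INFO] \- org.json4s:json4s-jackson_2.10:jar:3.2.10:compile
EOF
)
# Every surviving line names an artifact that drags in a Scala runtime.
echo "$matches"
```

If two different versions of scala-library (or scala-reflect) show up here, that is your mismatch.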

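Two other things worth checking, given the stack trace. First, `spark.driver.extraClassPath` in your snippet joins the jars with `;`, which is the Windows path separator; on Linux (your paths are under /home) the separator is `:`. Second, `java -jar` on a jar built by maven-jar-plugin will not have scala-library on the classpath at all, which by itself explains `NoClassDefFoundError: scala/collection/Seq`. One common fix is to build a fat jar. This is only a sketch, assuming the standard maven-shade-plugin (the version and mainClass are taken from your pom, not verified against your build):

```xml
<!-- Sketch: bundle scala-library and the connector into one runnable jar
     so that `java -jar target/test-2.0.5-SNAPSHOT.jar` can resolve them. -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>2.4.1</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <mainClass>com.mycompany.test.CassandraTest</mainClass>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>
```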
Cheers

On Wed, Jul 29, 2015 at 5:14 PM, Benjamin Ross <[email protected]>
wrote:

>  Hello all,
>
> I’m new to both Spark and Scala, and am running into an annoying error
> while attempting to prototype some Spark functionality. From what I’ve
> read in forums, this error should only appear when there’s a version
> mismatch between the Scala version used to compile Spark and the Scala
> version that I’m using. However, that’s not the case for me: I’m using
> Scala 2.10.4, and Spark was compiled against Scala 2.10.x. Perhaps I’m
> missing something here.
>
>
>
> The NoClassDefFoundError presents itself when debugging in Eclipse as
> well, but when running directly via the jar, the following error appears:
>
> Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/Seq
>         at com.latticeengines.test.CassandraTest.main(CassandraTest.scala)
> Caused by: java.lang.ClassNotFoundException: scala.collection.Seq
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>         ... 1 more
>
>
>
> I also get the following warning when invoking Maven, but it doesn’t seem
> to be related to the underlying issue:
>
> [INFO] Checking for multiple versions of scala
> [WARNING]  Expected all dependencies to require Scala version: 2.10.4
> [WARNING]  com.mycompany:test:2.0.5-SNAPSHOT requires scala version: 2.10.4
> [WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
> [WARNING]  org.spark-project.akka:akka-remote_2.10:2.3.4-spark requires scala version: 2.10.4
> [WARNING]  org.spark-project.akka:akka-actor_2.10:2.3.4-spark requires scala version: 2.10.4
> [WARNING]  org.spark-project.akka:akka-slf4j_2.10:2.3.4-spark requires scala version: 2.10.4
> [WARNING]  org.apache.spark:spark-core_2.10:1.4.1 requires scala version: 2.10.4
> [WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.0
> [WARNING] Multiple versions of scala libraries detected!
> [INFO] includes = [**/*.scala,**/*.java,]
>
>
>
> Here’s the code I’m trying to run:
>
>
>
> object CassandraTest {
>
>   def main(args: Array[String]) {
>
>     println("Hello, scala!")
>
>     val conf = new SparkConf(true).set("spark.cassandra.connection.host", "127.0.0.1").set(
>         "spark.driver.extraClassPath",
>         "/home/bross/.m2/repository/com/datastax/spark/spark-cassandra-connector_2.10/1.2.4/spark-cassandra-connector_2.10-1.2.4.jar;/home/bross/.m2/repository/com/datastax/spark/spark-cassandra-connector_2.10/1.2.4/spark-cassandra-connector_2.10-1.2.4.jar;/home/bross/.m2/repository/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar");
>
>     val sc = new SparkContext("local", "test", conf)
>     val sqlContext = new SQLContext(sc)
>     val df = sqlContext
>       .read
>       .format("org.apache.spark.sql.cassandra")
>       .options(Map("table" -> "kv", "keyspace" -> "test"))
>       .load()
>     val w = Window.orderBy("value").rowsBetween(-2, 0)
>     df.select(mean("value").over(w))
>   }
>
> }
>
>
>
> Here’s my maven file:
>
> <?xml version="1.0" encoding="UTF-8"?>
> <project xmlns="http://maven.apache.org/POM/4.0.0"
>     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
>     xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
>
>     <modelVersion>4.0.0</modelVersion>
>     <artifactId>test</artifactId>
>     <packaging>jar</packaging>
>     <name>${component-name}</name>
>
>     <properties>
>         <component-name>le-sparkdb</component-name>
>         <hadoop.version>2.6.0.2.2.0.0-2041</hadoop.version>
>         <scala.version>2.10.4</scala.version>
>         <spark.version>1.4.1</spark.version>
>         <avro.version>1.7.7</avro.version>
>         <parquet.avro.version>1.4.3</parquet.avro.version>
>         <le.domain.version>2.0.5-SNAPSHOT</le.domain.version>
>         <le.common.version>2.0.5-SNAPSHOT</le.common.version>
>         <le.eai.version>2.0.5-SNAPSHOT</le.eai.version>
>         <spark.cassandra.version>1.2.4</spark.cassandra.version>
>     </properties>
>
>     <parent>
>         <groupId>com.mycompany</groupId>
>         <artifactId>le-parent</artifactId>
>         <version>2.0.5-SNAPSHOT</version>
>         <relativePath>le-parent</relativePath>
>     </parent>
>
>     <build>
>         <plugins>
>             <plugin>
>                 <groupId>org.scala-tools</groupId>
>                 <artifactId>maven-scala-plugin</artifactId>
>                 <executions>
>                     <execution>
>                         <goals>
>                             <goal>compile</goal>
>                             <goal>testCompile</goal>
>                         </goals>
>                     </execution>
>                 </executions>
>             </plugin>
>             <plugin>
>                 <groupId>org.apache.maven.plugins</groupId>
>                 <artifactId>maven-eclipse-plugin</artifactId>
>                 <version>${maven.eclipse.version}</version>
>                 <configuration>
>                     <downloadSources>true</downloadSources>
>                     <downloadJavadocs>true</downloadJavadocs>
>                     <projectnatures>
>                         <projectnature>org.scala-ide.sdt.core.scalanature</projectnature>
>                         <projectnature>org.eclipse.jdt.core.javanature</projectnature>
>                     </projectnatures>
>                     <buildcommands>
>                         <buildcommand>org.scala-ide.sdt.core.scalabuilder</buildcommand>
>                     </buildcommands>
>                     <classpathContainers>
>                         <classpathContainer>org.scala-ide.sdt.launching.SCALA_CONTAINER</classpathContainer>
>                         <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
>                     </classpathContainers>
>                     <excludes>
>                         <exclude>org.scala-lang:scala-library</exclude>
>                         <exclude>org.scala-lang:scala-compiler</exclude>
>                     </excludes>
>                     <sourceIncludes>
>                         <sourceInclude>**/*.scala</sourceInclude>
>                         <sourceInclude>**/*.java</sourceInclude>
>                     </sourceIncludes>
>                 </configuration>
>             </plugin>
>             <plugin>
>                 <groupId>org.apache.maven.plugins</groupId>
>                 <artifactId>maven-jar-plugin</artifactId>
>                 <configuration>
>                     <archive>
>                         <manifest>
>                             <mainClass>com.mycompany.test.CassandraTest</mainClass>
>                         </manifest>
>                     </archive>
>                 </configuration>
>             </plugin>
>         </plugins>
>         <sourceDirectory>src/main/scala</sourceDirectory>
>     </build>
>
>     <dependencies>
>         <dependency>
>             <groupId>com.twitter</groupId>
>             <artifactId>parquet-avro</artifactId>
>             <version>${parquet.avro.version}</version>
>         </dependency>
>         <dependency>
>             <groupId>org.apache.avro</groupId>
>             <artifactId>avro</artifactId>
>             <version>${avro.version}</version>
>         </dependency>
>         <dependency>
>             <groupId>org.scala-lang</groupId>
>             <artifactId>scala-library</artifactId>
>             <version>${scala.version}</version>
>         </dependency>
>         <dependency>
>             <groupId>org.apache.spark</groupId>
>             <artifactId>spark-core_2.10</artifactId>
>             <version>${spark.version}</version>
>         </dependency>
>         <dependency>
>             <groupId>org.apache.spark</groupId>
>             <artifactId>spark-sql_2.10</artifactId>
>             <version>${spark.version}</version>
>         </dependency>
>         <dependency>
>             <groupId>com.mycompany</groupId>
>             <artifactId>le-domain</artifactId>
>             <version>${le.domain.version}</version>
>             <exclusions>
>                 <exclusion>
>                     <groupId>javax.servlet</groupId>
>                     <artifactId>servlet-api</artifactId>
>                 </exclusion>
>             </exclusions>
>         </dependency>
>         <dependency>
>             <groupId>com.mycompany</groupId>
>             <artifactId>le-common</artifactId>
>             <version>${le.common.version}</version>
>             <exclusions>
>                 <exclusion>
>                     <groupId>javax.servlet</groupId>
>                     <artifactId>servlet-api</artifactId>
>                 </exclusion>
>             </exclusions>
>         </dependency>
>         <dependency>
>             <groupId>com.datastax.spark</groupId>
>             <artifactId>spark-cassandra-connector_2.10</artifactId>
>             <version>${spark.cassandra.version}</version>
>         </dependency>
>         <dependency>
>             <groupId>com.datastax.spark</groupId>
>             <artifactId>spark-cassandra-connector-java_2.10</artifactId>
>             <version>${spark.cassandra.version}</version>
>         </dependency>
>     </dependencies>
>
> </project>
>
>
>
> Thanks so much for any input.
>
> Ben
>
