This is my first time trying Spark. I just downloaded it and am trying to
run the SimpleApp Java program with Maven. I added two Maven dependencies:
spark-core and scala-library. Even though my program is in Java, I was
forced to add the Scala dependency. Is that really required?
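
For reference, the dependency section of my pom.xml looks roughly like
this (the version numbers are my guess at what matches the
spark-0.9.1 download; please correct me if they are wrong):

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>0.9.1</version>
  </dependency>
  <!-- Added only because the build failed without it;
       not sure it should be needed for a pure Java app -->
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.10.3</version>
  </dependency>
</dependencies>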

Now I'm able to build, but executing the following line throws the error
shown further below:

JavaSparkContext sc = new JavaSparkContext("local", "Simple App",
        "C:\\tmp\\spark-0.9.1-bin-cdh4",
        new String[]{"target\\spark-hello-world-1.1.jar"});
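
In case it helps, the whole class is essentially the SimpleApp example
from the Spark quick start guide; a minimal sketch of what I have (the
README.md path and the line counting are just the quick start example,
nothing custom):

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

public class SimpleApp {
    public static void main(String[] args) {
        // This constructor call is line 8 of SimpleApp.java
        // in the stack trace below
        JavaSparkContext sc = new JavaSparkContext("local", "Simple App",
                "C:\\tmp\\spark-0.9.1-bin-cdh4",
                new String[]{"target\\spark-hello-world-1.1.jar"});

        // Count lines containing "a", as in the quick start example
        JavaRDD<String> logData = sc.textFile("README.md").cache();
        long numAs = logData.filter(new Function<String, Boolean>() {
            public Boolean call(String s) {
                return s.contains("a");
            }
        }).count();
        System.out.println("Lines with a: " + numAs);
    }
}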

java.lang.NoClassDefFoundError: org/apache/hadoop/io/Writable
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:81)
        at SimpleApp.main(SimpleApp.java:8)
        at TestSimpleApp.testMain(TestSimpleApp.java:14)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at junit.framework.TestCase.runTest(TestCase.java:164)
        at junit.framework.TestCase.runBare(TestCase.java:130)
        at junit.framework.TestResult$1.protect(TestResult.java:110)
        at junit.framework.TestResult.runProtected(TestResult.java:128)
        at junit.framework.TestResult.run(TestResult.java:113)
        at junit.framework.TestCase.run(TestCase.java:120)
        at junit.framework.TestSuite.runTest(TestSuite.java:228)
        at junit.framework.TestSuite.run(TestSuite.java:223)
        at org.junit.internal.runners.OldTestClassRunner.run(OldTestClassRunner.java:35)
        at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:59)
        at org.apache.maven.surefire.suite.AbstractDirectoryTestSuite.executeTestSet(AbstractDirectoryTestSuite.java:120)
        at org.apache.maven.surefire.suite.AbstractDirectoryTestSuite.execute(AbstractDirectoryTestSuite.java:103)
        at org.apache.maven.surefire.Surefire.run(Surefire.java:169)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.maven.surefire.booter.SurefireBooter.runSuitesInProcess(SurefireBooter.java:350)
        at org.apache.maven.surefire.booter.SurefireBooter.main(SurefireBooter.java:1021)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.Writable
        at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:276)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
        at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
        ... 26 more
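
From the ClassNotFoundException, my guess is that
org.apache.hadoop.io.Writable lives in a Hadoop jar that isn't on my
classpath. Would adding a Hadoop client dependency along these lines be
the right fix (the version is a guess at what matches my
spark-0.9.1-bin-cdh4 download, and I believe the CDH artifacts need
Cloudera's Maven repository)?

<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <!-- guessed to match the CDH4 build of Spark; correct me if wrong -->
  <version>2.0.0-mr1-cdh4.2.0</version>
</dependency>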


