Re: Error: Exception in thread main java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration

2015-08-16 Thread Rishi Yadav
Try passing the HBase jars with --jars when you submit your application jar;
--class only selects the main class, it does not put dependency jars on the
classpath.
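
For example, something along these lines (the HBase jar paths below are only
placeholders for wherever your HBase client jars actually live):

 ~/Desktop/spark/bin/spark-submit --class org.example.main.scalaConsumer \
   --jars /path/to/hbase-client.jar,/path/to/hbase-common.jar,/path/to/hbase-protocol.jar \
   scalConsumer-0.0.1-SNAPSHOT.jar

--jars takes a comma-separated list and ships those jars to the driver and the
executors, so org.apache.hadoop.hbase.HBaseConfiguration can be resolved at
runtime.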







Re: Error: Exception in thread main java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration

2015-08-14 Thread Stephen Boesch
NoClassDefFoundError differs from ClassNotFoundException: it means the class
definition was available when your code was compiled, but at runtime it either
could not be loaded or the class failed while initializing. Please provide the
full stack trace.
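
A minimal, self-contained Scala sketch of that distinction (my own
illustration, not code from this thread; Flaky and org.example.DoesNotExist
are made-up names):

  object Flaky {
    // the initializer throws, so the backing class can never finish initializing
    throw new RuntimeException("init failed")
  }

  object Demo extends App {
    // a name that simply is not on the classpath -> ClassNotFoundException
    try Class.forName("org.example.DoesNotExist")
    catch { case e: ClassNotFoundException => println(e) }

    // first touch of Flaky: its initializer fails -> ExceptionInInitializerError
    try println(Flaky) catch { case e: Throwable => println(e) }

    // any later touch: the JVM has already marked the class as failed -> NoClassDefFoundError
    try println(Flaky) catch { case e: Throwable => println(e) }
  }

In this thread the likelier cause is the other flavour of NoClassDefFoundError:
the HBase jars were on the compile classpath via Maven but are missing from the
runtime classpath used by spark-submit.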




Error: Exception in thread main java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration

2015-08-14 Thread stelsavva
Hello, I am just starting out with Spark Streaming and HBase/Hadoop. I'm
writing a simple app to read from Kafka and store to HBase, and I am having
trouble submitting my job to Spark.

I've downloaded Apache Spark 1.4.1 pre-built for Hadoop 2.6.

I am building the project with mvn package

and submitting the jar file with 

 ~/Desktop/spark/bin/spark-submit --class org.example.main.scalaConsumer
scalConsumer-0.0.1-SNAPSHOT.jar 

I then get the error you see in the subject line. Is this a problem with my
Maven dependencies? Do I need to install Hadoop locally? And if so, how can
I add the Hadoop classpath to the Spark job?
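
(For reference, the class in the error comes from HBase's client libraries,
which a Maven build would declare with something along these lines; the
version shown is only a placeholder and should match your HBase cluster:

  <dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <!-- placeholder version: pick the release matching your HBase installation -->
    <version>1.0.1.1</version>
  </dependency>

Note that a compile-scope dependency only fixes compilation; at runtime the
jars still have to reach the driver and executors, e.g. via spark-submit
--jars or by packaging an assembly/shaded jar instead of the plain mvn
package output.)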


