This seems like a version mismatch between the deployed HBase version and the jars being used.

Here are the details of the versions we have set up:

We are running CDH-4.6.0 (which includes Hadoop 2.0.0), and Spark was
compiled against that version. Below is the environment variable we set before
compiling:
SPARK_HADOOP_VERSION=2.0.0+1554-cdh4.6.0
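For reference, the build step looked roughly like this (a sketch only; the exact sbt invocation depends on the Spark 0.9 source checkout):

```shell
# Set the CDH-specific Hadoop version before building Spark from source
export SPARK_HADOOP_VERSION=2.0.0+1554-cdh4.6.0
echo "Building Spark against Hadoop $SPARK_HADOOP_VERSION"

# Then, from the Spark source root, run the assembly build, e.g.:
#   sbt/sbt assembly
```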

The deployed code uses the following Maven dependency:
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>0.9.0-incubating</version>
    </dependency>

Thanks for your help.
Kanwal

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Problem-with-HBase-external-table-on-freshly-created-EMR-cluster-tp2307p3004.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.