Thanks, Soumitra Kumar,
I don't know why you put hbase-protocol.jar in SPARK_CLASSPATH while adding
hbase-protocol.jar, hbase-common.jar, hbase-client.jar, and htrace-core.jar
to --jars, but it did work.
Actually, I put all four of these jars in SPARK_CLASSPATH along with the
HBase conf directory.
Great, it worked.
I don't have an answer as to what is special about SPARK_CLASSPATH vs --jars;
I just found the working setting through trial and error.
----- Original Message -----
From: Fengyun RAO raofeng...@gmail.com
To: Soumitra Kumar kumar.soumi...@gmail.com
Cc: user@spark.apache.org, +user@hbase
2014-10-15 20:48 GMT+08:00 Fengyun RAO raofeng...@gmail.com:
We use Spark 1.1, and HBase 0.98.1-cdh5.1.0, and need to read and write an
HBase table in Spark program.
I notice there are spark.driver.extraClassPath and
spark.executor.extraClassPath properties to manage extra classpath entries.
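For reference, a minimal sketch of what using those two properties (instead of the SPARK_CLASSPATH environment variable) could look like. The parcel path is taken from the hbase-protocol.jar location shown later in this thread; everything else here is an assumption, not a verified setup:

```shell
# Sketch only: put hbase-protocol.jar on both driver and executor classpaths
# via the extraClassPath properties. HBASE_LIB is an assumed CDH parcel path.
HBASE_LIB=/opt/cloudera/parcels/CDH/lib/hbase

spark-submit \
  --conf spark.driver.extraClassPath=$HBASE_LIB/hbase-protocol.jar \
  --conf spark.executor.extraClassPath=$HBASE_LIB/hbase-protocol.jar \
  ...
```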
I am writing to HBase; these are my options:
export SPARK_CLASSPATH=/opt/cloudera/parcels/CDH/lib/hbase/hbase-protocol.jar
spark-submit \
--jars
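For completeness, a sketch of how the full submit command might look with hbase-protocol.jar exported in SPARK_CLASSPATH and all four jars passed via --jars, as described above. Only the hbase-protocol.jar path appears in the thread; the other jar locations, the class name, and the application jar are assumptions modeled on a typical CDH 5 parcel layout:

```shell
# Sketch, not a verified invocation. HBASE_LIB, the htrace-core.jar location,
# com.example.MyHBaseJob, and my-job.jar are all hypothetical placeholders.
export HBASE_LIB=/opt/cloudera/parcels/CDH/lib/hbase
export SPARK_CLASSPATH=$HBASE_LIB/hbase-protocol.jar

spark-submit \
  --class com.example.MyHBaseJob \
  --jars $HBASE_LIB/hbase-protocol.jar,$HBASE_LIB/hbase-common.jar,$HBASE_LIB/hbase-client.jar,$HBASE_LIB/lib/htrace-core.jar \
  my-job.jar
```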