We use Spark 1.1 and HBase 0.98.1-cdh5.1.0, and need to read and write an
HBase table in a Spark program.

I notice there are the
spark.driver.extraClassPath and
spark.executor.extraClassPath properties for managing extra classpath
entries, as well as the deprecated SPARK_CLASSPATH.
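For context, here is how I would set those properties on submission (a sketch only; the class name, jar name, and paths are placeholders, and I am assuming the JVM-style `/dir/*` classpath wildcard is accepted here):

```shell
# Placeholder job and paths -- adjust to your deployment.
# The same two properties can instead go into conf/spark-defaults.conf.
spark-submit \
  --class com.example.MyHBaseJob \
  --conf spark.driver.extraClassPath='/path/to/hbase/jars/*' \
  --conf spark.executor.extraClassPath='/path/to/hbase/jars/*' \
  my-hbase-job.jar
```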

The problem is: which classpath entries or jars should we append?
I could simply add the whole `hbase classpath`, which is huge,
but this leads to dependency conflicts, e.g. HBase uses guava-12 while
Spark uses guava-14.
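One alternative I am considering (an untested sketch; the lib directory and the exact jar set are my guesses at the client-side minimum for 0.98, and versions may differ under CDH) is to build the classpath from only a handful of HBase jars instead of the full `hbase classpath`:

```shell
# Assumed HBase lib location under the CDH parcel -- adjust as needed.
HBASE_LIB=/opt/cloudera/parcels/CDH/lib/hbase/lib

# Append only the jars a client-side Spark job is likely to need
# (hbase-server is included because TableInputFormat lives there in 0.98):
HBASE_CP=$HBASE_LIB/hbase-common-0.98.1-cdh5.1.0.jar
HBASE_CP=$HBASE_CP:$HBASE_LIB/hbase-client-0.98.1-cdh5.1.0.jar
HBASE_CP=$HBASE_CP:$HBASE_LIB/hbase-protocol-0.98.1-cdh5.1.0.jar
HBASE_CP=$HBASE_CP:$HBASE_LIB/hbase-server-0.98.1-cdh5.1.0.jar

# This string would then be passed as spark.driver.extraClassPath /
# spark.executor.extraClassPath, keeping HBase's guava off the classpath.
echo "$HBASE_CP"
```

That way Spark's own guava-14 stays in front and HBase's guava-12 is never added, though whether HBase's client code is happy without its bundled guava is exactly what I am unsure about.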
