No, your error is right there in the logs. Unset SPARK_CLASSPATH.
On Fri, Sep 12, 2014 at 10:20 PM, freedafeng wrote:
> : org.apache.spark.SparkException: Found both spark.driver.extraClassPath
> and SPARK_CLASSPATH. Use only the former.
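A minimal sketch of the suggested fix, assuming SPARK_CLASSPATH was exported in the submitting shell (it can also be set in conf/spark-env.sh, in which case it should be removed there as well); the paths below are placeholders, not the poster's actual jars:

```shell
export SPARK_CLASSPATH=/tmp/old-style.jar   # simulate the offending legacy setting
unset SPARK_CLASSPATH                       # clear it before calling spark-submit
# Then pass extra jars via the supported option (spark.driver.extraClassPath), e.g.:
# ./bin/spark-submit --driver-class-path /path/to/app-deps.jar app.py
```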
--
The same command passed in another quick-start VM (v4.7) which has HBase 0.96
installed. Maybe there is a conflict between the newer HBase version and
Spark 1.1.0? Just my guess.
Thanks.
--
I'm a newbie to Java, so please be specific about how to resolve this.
The command I was running is:
$ ./spark-submit --driver-class-path
/home/cloudera/Downloads/spark-1.1.0-bin-hadoop2.3/lib/spark-examples-1.1.0-hadoop2.3.0.jar
/home/cloudera/Downloads/spark-1.1.0-bin-hadoop2.3/examples/src/main/python/