FYI, in case anybody else hits this problem: we switched to Spark 1.1
(outside CDH) and the same Spark application worked the first time (once
recompiled against the Spark 1.1 libs, of course). I assume this is because
Spark 1.1 is built with Hive support.
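For anyone wanting to reproduce the fix, the change amounts to building against a
Spark distribution that bundles Hive and making the Hadoop classes available at
compile time. A minimal build.sbt sketch (the Scala, Spark, and Hadoop version
numbers here are assumptions; match them to your actual cluster):

```scala
// build.sbt -- a minimal sketch, assuming Spark 1.1.0 on Scala 2.10
// and a Hadoop 2.x cluster; adjust versions to your deployment.
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // Core and SQL are "provided" because the cluster supplies them at runtime.
  "org.apache.spark" %% "spark-core" % "1.1.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.1.0" % "provided",
  // spark-hive is a separate module in Spark 1.1; it pulls in the Hive
  // execution classes that HiveContext needs.
  "org.apache.spark" %% "spark-hive" % "1.1.0" % "provided",
  // hadoop-client supplies org.apache.hadoop.mapred.JobConf,
  // the class reported missing in the stack trace below.
  "org.apache.hadoop" % "hadoop-client" % "2.4.0" % "provided"
)
```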
On 29 September 2014 17:41, Patrick McGloin
wrote:
> Hi,
> I have an error when submitting a Spark SQL application to our Spark
> cluster:
> 14/09/29 16:02:11 WARN scheduler.TaskSetManager: Loss was due to
> java.lang.NoClassDefFoundError
> *java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/JobConf*
> at
> org.apache.spark.sql.hive.SparkHiveHadoop