repo and it got replicated from the Maven Central
repo, but during the build process, Maven picked up 0.13.1 instead of 0.13.1a.
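If Maven is resolving the upstream 0.13.1 artifact instead of the forked 0.13.1a, one way to force the intended version is a dependencyManagement pin. This is only a sketch; the org.spark-project.hive groupId and artifactId below are assumptions based on the coordinates Spark's forked Hive artifacts were published under, so check your repository for the exact values:

```xml
<!-- pom.xml (sketch): pin the forked Hive artifact so Maven does not
     fall back to the upstream 0.13.1 release. The groupId/artifactId
     here are assumptions; verify the exact coordinates in your repo. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.spark-project.hive</groupId>
      <artifactId>hive-exec</artifactId>
      <version>0.13.1a</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

After pinning, `mvn dependency:tree` should show 0.13.1a being resolved rather than 0.13.1.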
Date: Wed, 10 Dec 2014 12:23:08 -0800
Subject: Re: Build Spark 1.2.0-rc1 encounter exceptions when running
HiveContext - Caused by: java.lang.ClassNotFoundException
Hi All,
I tried to set SPARK_CLASSPATH in spark-env.sh to include the
auxiliary JARs and datanucleus*.jar files from Hive; however, when I run
HiveContext, it gives me the following error:
Caused by: java.lang.ClassNotFoundException:
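For reference, the classpath additions described above might look like the following in conf/spark-env.sh. This is a sketch, not the poster's actual file; the /opt/hive/lib path is a placeholder you would replace with your own Hive install location:

```shell
# conf/spark-env.sh (sketch): add Hive's DataNucleus JARs to Spark's classpath.
# /opt/hive/lib is a placeholder; point HIVE_LIB at your actual Hive lib dir.
HIVE_LIB=/opt/hive/lib
for jar in "$HIVE_LIB"/datanucleus-*.jar; do
  SPARK_CLASSPATH="$SPARK_CLASSPATH:$jar"
done
export SPARK_CLASSPATH
```

The glob picks up the datanucleus-api-jdo, datanucleus-core, and datanucleus-rdbms JARs without hard-coding their version numbers.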
Apologies for the formatting; somehow it got messed up and the linefeeds were removed.
Here's a reformatted version.
Hi All,
I tried to set SPARK_CLASSPATH in spark-env.sh to include the
auxiliary JARs and datanucleus*.jar files from Hive; however, when I run
HiveContext, it gives me
Hi Andrew,
It looks like you are somehow including JARs from the upstream Apache
Hive 0.13 project on your classpath. For Hive 0.13 support in Spark 1.2,
we had to modify Hive to use a different version of Kryo that is
compatible with Spark's Kryo version.