>
> I imagine it's not the only instance of this kind of problem people
> will ever encounter. Can you rebuild Spark with this particular
> release of Hive?


Unfortunately, the Hive APIs that we use change too much from release to
release to make this possible.  There is a JIRA for compiling Spark SQL
against Hive 13: SPARK-2706
<https://issues.apache.org/jira/browse/SPARK-2706>.

> if I try to add hive-exec-0.12.0-cdh5.0.3.jar to my SPARK_CLASSPATH, in
> order to get DeprecatedParquetInputFormat, I find out that there is an
> incompatibility in the SerDeUtils class.  Spark's Hive snapshot expects to
> find


Instead of including CDH's version of Hive, I'd try just including the Hive
jars for Parquet from here:
http://mvnrepository.com/artifact/com.twitter/parquet-hive-bundle/1.5.0
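A minimal sketch of that approach, assuming you download the bundle jar to a
local directory (the paths and the download step are assumptions; adjust them
to your setup):

```shell
# Add the Parquet Hive bundle to Spark's classpath instead of the full
# CDH Hive jars.  The jar version matches the Maven link above; the
# local path is an assumption.
JAR_URL="https://repo1.maven.org/maven2/com/twitter/parquet-hive-bundle/1.5.0/parquet-hive-bundle-1.5.0.jar"
JAR_PATH="$HOME/jars/parquet-hive-bundle-1.5.0.jar"

mkdir -p "$(dirname "$JAR_PATH")"
# wget -O "$JAR_PATH" "$JAR_URL"   # uncomment to actually download the jar

# Prepend the jar, preserving any classpath entries already set.
export SPARK_CLASSPATH="$JAR_PATH${SPARK_CLASSPATH:+:$SPARK_CLASSPATH}"
echo "$SPARK_CLASSPATH"
```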

However, support for this is a work in progress.  You'll likely need to
make sure you have a version of Spark that includes this commit (added last
Friday):
https://github.com/apache/spark/commit/9016af3f2729101027e33593e094332f05f48d92
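One way to check whether a Spark checkout already contains that commit is with
`git merge-base --is-ancestor` (a sketch, run from inside your Spark clone; the
hash is taken from the GitHub URL above):

```shell
# Report whether the Parquet-support commit is an ancestor of HEAD.
COMMIT=9016af3f2729101027e33593e094332f05f48d92
STATUS=$(git merge-base --is-ancestor "$COMMIT" HEAD 2>/dev/null \
           && echo present || echo missing)
echo "commit $STATUS"
```

If it reports "missing", pull a newer master (or cherry-pick the commit)
before building.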

Another option would be to try this *experimental* patch: pr/1819.
