[ https://issues.apache.org/jira/browse/SPARK-4852?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14562718#comment-14562718 ]
Cheng Lian commented on SPARK-4852:
-----------------------------------
I'm not sure how to fix this properly for golden answer generation; maybe we
should just ask developers to manually copy Kryo 2.21 for now? Upgrading Kryo
looks pretty scary, since it has caused us a lot of dependency hell before and
we put a lot of effort into shading it properly.
> Hive query plan deserialization failure caused by shaded hive-exec jar file
> when generating golden answers
> ----------------------------------------------------------------------------------------------------------
>
> Key: SPARK-4852
> URL: https://issues.apache.org/jira/browse/SPARK-4852
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.2.0
> Reporter: Cheng Lian
> Priority: Minor
>
> When adding Hive 0.13.1 support to the Spark SQL Thrift server in PR
> [2685|https://github.com/apache/spark/pull/2685], the Kryo 2.22 bundled in
> the original hive-exec-0.13.1.jar was replaced with the Kryo 2.21 used by
> Spark SQL while shading the jar, because of dependency hell. Unfortunately,
> Kryo 2.21 has a known bug that can cause Hive query plan deserialization
> failures. This bug was fixed in Kryo 2.22.
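> For context, the failure happens in the kind of Kryo round trip sketched
> below (a minimal illustration, not Hive's actual code path; the
> {{java.util.ArrayList}} stand-in for a query plan is purely hypothetical):
> {code:scala}
> import com.esotericsoftware.kryo.Kryo
> import com.esotericsoftware.kryo.io.{Input, Output}
>
> // Serialize and deserialize an object graph with Kryo, as Hive does with
> // its query plans. With a buggy Kryo on the classpath, the readObject step
> // is where deserialization of a real plan would fail.
> object KryoRoundTrip {
>   def main(args: Array[String]): Unit = {
>     val kryo = new Kryo()
>
>     val plan = new java.util.ArrayList[String]()  // stand-in for a query plan
>     plan.add("TableScan")
>     plan.add("Select")
>
>     val output = new Output(4096)
>     kryo.writeObject(output, plan)
>     output.close()
>
>     val input = new Input(output.toBytes)
>     val restored = kryo.readObject(input, classOf[java.util.ArrayList[String]])
>     input.close()
>
>     println(restored)  // prints: [TableScan, Select]
>   }
> }
> {code}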
> Normally, this issue doesn't affect Spark SQL, because we never generate
> Hive query plans ourselves. But when running Hive test suites like
> {{HiveCompatibilitySuite}}, golden answer files must be generated by Hive,
> which triggers this issue. A workaround is to replace
> {{hive-exec-0.13.1.jar}} under {{$HIVE_HOME/lib}} with Spark's
> {{hive-exec-0.13.1a.jar}} and {{kryo-2.21.jar}}, both found under
> {{$SPARK_DEV_HOME/lib_managed/jars}}, and then add {{$HIVE_HOME/lib}} to
> {{$HADOOP_CLASSPATH}}.
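> To check that the replaced jars actually took effect, a quick sanity check
> along these lines can print which jar the Kryo class is loaded from (a
> sketch only; it assumes the unshaded {{com.esotericsoftware.kryo}} package
> name):
> {code:scala}
> // Print the jar that the JVM resolved the Kryo class from, so we can
> // confirm the classpath rearrangement picked up the intended version.
> object KryoLocation {
>   def main(args: Array[String]): Unit = {
>     val kryoClass = Class.forName("com.esotericsoftware.kryo.Kryo")
>     val source = kryoClass.getProtectionDomain.getCodeSource
>     println(if (source != null) source.getLocation else "unknown code source")
>   }
> }
> {code}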
> Upgrading to a newer version of Kryo that is binary compatible with Kryo
> 2.22 (if one exists) may fix this issue.