[
https://issues.apache.org/jira/browse/HIVE-14825?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15688618#comment-15688618
]
Rui Li commented on HIVE-14825:
-------------------------------
Hi [~kellyzly], it's covered in the [Configuring
Hive|https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started#HiveonSpark:GettingStarted-ConfiguringHive]
section.
If you run Spark on YARN, you can link scala-library, spark-core, and
spark-network-common into your Hive lib directory.
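As a minimal sketch of that linking step (not from the JIRA itself): the script below symlinks the three jars from Spark's jars directory into Hive's lib directory. It assumes SPARK_HOME and HIVE_HOME are set and that your Spark 2.0.0 distribution keeps its jars under $SPARK_HOME/jars; the exact jar file names and versions will vary with your build, so they are matched by prefix rather than spelled out.

# link_spark_jars.py - sketch, assuming SPARK_HOME and HIVE_HOME are exported
import glob
import os

spark_jars = os.path.join(os.environ["SPARK_HOME"], "jars")
hive_lib = os.path.join(os.environ["HIVE_HOME"], "lib")

# The minimum set named in the comment above:
# scala-library, spark-core, spark-network-common.
for prefix in ("scala-library", "spark-core", "spark-network-common"):
    for jar in glob.glob(os.path.join(spark_jars, prefix + "*.jar")):
        target = os.path.join(hive_lib, os.path.basename(jar))
        if not os.path.exists(target):
            # Link rather than copy, as suggested in the comment.
            os.symlink(jar, target)

Alternatively, users who prefer convenience over a minimal footprint can point Hive at all of Spark's jars instead of linking just these three, as the issue description below notes.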
> Figure out the minimum set of required jars for Hive on Spark after bumping
> up to Spark 2.0.0
> ---------------------------------------------------------------------------------------------
>
> Key: HIVE-14825
> URL: https://issues.apache.org/jira/browse/HIVE-14825
> Project: Hive
> Issue Type: Task
> Components: Documentation
> Reporter: Ferdinand Xu
> Assignee: Rui Li
> Fix For: 2.2.0
>
>
> Considering that there is no assembly jar for Spark since 2.0.0, we should
> figure out the minimum set of required jars for HoS to work after bumping up
> to Spark 2.0.0. That way, users can decide whether to add just the required
> jars or all the jars under Spark's dir for convenience.