[
https://issues.apache.org/jira/browse/HIVE-14825?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15692463#comment-15692463
]
liyunzhang_intel commented on HIVE-14825:
-----------------------------------------
[~ruili]: Will Spark load all the jars in $HIVE_HOME/lib/ if I copy all of
$SPARK_HOME/jars/* to $HIVE_HOME/lib? When I read the code, I found Hive only
sends hive-exec*.jar to Spark.
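A sketch of the copy being asked about, using throwaway demo directories and fake jar files so it runs anywhere; the directory layout ($SPARK_HOME/jars, $HIVE_HOME/lib) matches Spark 2.x and Hive, but the specific jar names are stand-ins, not a confirmed minimum set:

```shell
#!/usr/bin/env bash
# Demo stand-ins for the real installation directories.
SPARK_HOME=$(mktemp -d)/spark
HIVE_HOME=$(mktemp -d)/hive
mkdir -p "$SPARK_HOME/jars" "$HIVE_HOME/lib"

# Fake jars standing in for the contents of Spark 2.0's jars/ dir.
touch "$SPARK_HOME"/jars/{scala-library-2.11.8,spark-core_2.11-2.0.0,spark-network-common_2.11-2.0.0,extra-dep-1.0}.jar

# The copy under discussion: everything in Spark's jars dir into Hive's lib dir.
cp "$SPARK_HOME"/jars/*.jar "$HIVE_HOME"/lib/

ls "$HIVE_HOME"/lib
```

Whether Hive's classpath machinery then ships all of those jars to Spark, or only hive-exec, is exactly the question above; this only shows the copy itself.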
> Figure out the minimum set of required jars for Hive on Spark after bumping
> up to Spark 2.0.0
> ---------------------------------------------------------------------------------------------
>
> Key: HIVE-14825
> URL: https://issues.apache.org/jira/browse/HIVE-14825
> Project: Hive
> Issue Type: Task
> Components: Documentation
> Reporter: Ferdinand Xu
> Assignee: Rui Li
> Fix For: 2.2.0
>
>
> Considering that there's no assembly jar for Spark since 2.0.0, we should
> figure out the minimum set of required jars for HoS to work after bumping up
> to Spark 2.0.0. That way, users can decide whether to add just the required
> jars, or all the jars under Spark's dir for convenience.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)