[ https://issues.apache.org/jira/browse/HIVE-14029?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15512499#comment-15512499 ]

Rui Li commented on HIVE-14029:
-------------------------------

Hi [~Ferd], my thought was that we should be able to tell users what is 
actually needed (i.e. the minimum set of required jars) for HoS to work. 
Users can then decide whether to add just the required jars, or all the jars 
under Spark's dir for convenience. This is just something good to have and 
doesn't block this ticket - we used to add the whole assembly anyway.
Besides, I think not all the dependencies of spark-core are needed, because 
some of them should already be in Hive's classpath, e.g. hadoop, commons, etc.

> Update Spark version to 2.0.0
> -----------------------------
>
>                 Key: HIVE-14029
>                 URL: https://issues.apache.org/jira/browse/HIVE-14029
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Ferdinand Xu
>            Assignee: Ferdinand Xu
>         Attachments: HIVE-14029.1.patch, HIVE-14029.2.patch, 
> HIVE-14029.3.patch, HIVE-14029.4.patch, HIVE-14029.patch
>
>
> There are quite a few new optimizations in Spark 2.0.0. We need to bump 
> Spark up to 2.0.0 to benefit from those performance improvements.
> To update the Spark version to 2.0.0, the following changes are required:
> * Spark API updates (see the sketch after this list):
> ** SparkShuffler#call returns Iterator instead of Iterable
> ** SparkListener -> JavaSparkListener
> ** InputMetrics constructor doesn’t accept readMethod
> ** Methods remoteBlocksFetched and localBlocksFetched in ShuffleReadMetrics 
> return long instead of int
> * Dependency upgrade:
> ** Jackson: 2.4.2 -> 2.6.5
> ** Netty version: 4.0.23.Final -> 4.0.29.Final
> ** Scala binary version: 2.10 -> 2.11
> ** Scala version: 2.10.4 -> 2.11.8
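>
> Below is a minimal Java sketch of the API updates listed above, written 
> against Spark 2.0's public API. The class and method names here 
> (Spark20MigrationSketch, WordSplitter, JobStartLogger, totalBlocksFetched) 
> are illustrative stand-ins, not Hive's actual code:
> {code:java}
> import java.util.Arrays;
> import java.util.Iterator;
>
> import org.apache.spark.api.java.function.FlatMapFunction;
> import org.apache.spark.executor.ShuffleReadMetrics;
> import org.apache.spark.scheduler.SparkListener;
> import org.apache.spark.scheduler.SparkListenerJobStart;
>
> public class Spark20MigrationSketch {
>
>   // Spark 2.0 changed FlatMapFunction#call to return Iterator<R> instead of
>   // Iterable<R>; implementations such as SparkShuffler#call need the same fix.
>   static class WordSplitter implements FlatMapFunction<String, String> {
>     @Override
>     public Iterator<String> call(String line) {  // was: Iterable<String>
>       return Arrays.asList(line.split(" ")).iterator();
>     }
>   }
>
>   // Spark 2.0 dropped the JavaSparkListener adapter class; SparkListener
>   // itself now provides no-op defaults, so Java listeners extend it directly.
>   static class JobStartLogger extends SparkListener {
>     @Override
>     public void onJobStart(SparkListenerJobStart jobStart) {
>       System.out.println("Job started: " + jobStart.jobId());
>     }
>   }
>
>   // remoteBlocksFetched/localBlocksFetched return long in 2.0 (were int),
>   // so callers must widen any counters that accumulate them.
>   static long totalBlocksFetched(ShuffleReadMetrics m) {
>     return m.remoteBlocksFetched() + m.localBlocksFetched();
>   }
> }
> {code}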



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
