[ https://issues.apache.org/jira/browse/HIVE-14029?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15518237#comment-15518237 ]

Ferdinand Xu edited comment on HIVE-14029 at 9/24/16 3:10 AM:
--------------------------------------------------------------

Hi [~xuefuz] These two dependencies (Jackson and Netty) are not required at 
build time; they are required at runtime. If you try to run an HoS job, it 
will fail to create the Spark client because of API changes in these two 
libraries. You can see the failed queries above for reference.




> Update Spark version to 2.0.0
> -----------------------------
>
>                 Key: HIVE-14029
>                 URL: https://issues.apache.org/jira/browse/HIVE-14029
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Ferdinand Xu
>            Assignee: Ferdinand Xu
>         Attachments: HIVE-14029.1.patch, HIVE-14029.2.patch, 
> HIVE-14029.3.patch, HIVE-14029.4.patch, HIVE-14029.5.patch, HIVE-14029.patch
>
>
> There are quite a few new optimizations in Spark 2.0.0. We need to bump 
> Spark up to 2.0.0 to benefit from those performance improvements.
> To update the Spark version to 2.0.0, the following changes are required:
> * Spark API updates (a sketch of the Iterator change follows this list):
> ** SparkShuffler#call returns an Iterator instead of an Iterable
> ** SparkListener -> JavaSparkListener
> ** The InputMetrics constructor no longer accepts a readMethod argument
> ** The remoteBlocksFetched and localBlocksFetched methods in 
> ShuffleReadMetrics return long instead of int
> * Dependency upgrades (see the pom.xml sketch after this list):
> ** Jackson: 2.4.2 -> 2.6.5
> ** Netty: 4.0.23.Final -> 4.0.29.Final
> ** Scala binary version: 2.10 -> 2.11
> ** Scala version: 2.10.4 -> 2.11.8
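>
> The Iterator change above, sketched minimally. This uses a toy 
> FlatMapFunction rather than Hive's actual SparkShuffler implementations; 
> the SplitWords class and its String generics are made up for illustration:
> {code:java}
> import java.util.Arrays;
> import java.util.Iterator;
>
> import org.apache.spark.api.java.function.FlatMapFunction;
>
> // Against Spark 1.x, the signature was:
> //   public Iterable<String> call(String line)
> // Against Spark 2.0.0, call() must return an Iterator instead.
> public class SplitWords implements FlatMapFunction<String, String> {
>   @Override
>   public Iterator<String> call(String line) {
>     // Iterable#iterator() bridges the old return value to the new contract.
>     return Arrays.asList(line.split(" ")).iterator();
>   }
> }
> {code}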
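>
> And a sketch of the dependency bumps as they might look in Hive's root 
> pom.xml. The property names here are assumptions for illustration; the 
> actual names used by the patch may differ:
> {code:xml}
> <!-- Version values match the upgrade list above; property names are assumed. -->
> <jackson.version>2.6.5</jackson.version>
> <netty.version>4.0.29.Final</netty.version>
> <scala.binary.version>2.11</scala.binary.version>
> <scala.version>2.11.8</scala.version>
> <spark.version>2.0.0</spark.version>
> {code}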



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
