On 17 Oct 2016, at 18:26, Ryan Blue <rb...@netflix.com> wrote:

Are these changes that the Hive community has rejected? I don't see a 
compelling reason to have a long-term Spark fork of Hive.

More changes in Hive that haven't been picked up:

HIVE-11720 is needed to handle very long HTTP headers, which is exactly the 
kind of header Active Directory likes to generate.
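(For context: HIVE-11720 made HiveServer2's HTTP header limits configurable. A sketch of what the resulting hive-site.xml tuning looks like, assuming the property names that patch introduced; the 64 KB value is illustrative, not a recommendation:)

```xml
<!-- Sketch: raise HiveServer2's HTTP-mode header limits so that large
     Kerberos/Active Directory negotiation headers are not rejected.
     Property names as added by HIVE-11720. -->
<property>
  <name>hive.server2.thrift.http.request.header.size</name>
  <value>65536</value>
</property>
<property>
  <name>hive.server2.thrift.http.response.header.size</name>
  <value>65536</value>
</property>
```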

The other one fixes POM dependencies to stop groovy-all getting into Spark. 
That's been fixed in Spark's own POMs; pushing it into the Spark Hive fork ensures 
that nobody else gets it. This matters because you can abuse serialisation and have 
SparkContext.objectFile() run arbitrary code in the loader's process. Not ideal.
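(To make the serialisation risk concrete: a minimal, self-contained Java sketch, not Spark's actual code path, showing why deserialising untrusted data is dangerous. ObjectInputStream invokes a class's private readObject method during deserialisation, so any class on the classpath can run code as a side effect of merely reading the stream; the Payload class here is hypothetical.)

```java
import java.io.*;

public class DeserializationDemo {
    // Hypothetical class standing in for any serializable class on the
    // classpath with a custom readObject hook.
    static class Payload implements Serializable {
        private static final long serialVersionUID = 1L;
        static boolean sideEffectRan = false;

        // Invoked automatically by ObjectInputStream during deserialization.
        private void readObject(ObjectInputStream in)
                throws IOException, ClassNotFoundException {
            in.defaultReadObject();
            sideEffectRan = true; // arbitrary code could execute here
        }
    }

    public static void main(String[] args) throws Exception {
        // Serialize an instance to a byte buffer.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(buf)) {
            out.writeObject(new Payload());
        }
        // Deserializing runs Payload.readObject before any cast or use.
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(buf.toByteArray()))) {
            in.readObject();
        }
        System.out.println("sideEffectRan=" + Payload.sideEffectRan);
    }
}
```

This is why pulling a library like groovy-all (which ships known deserialisation gadgets) onto the classpath widens the attack surface even if nothing calls it directly.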


On Sat, Oct 15, 2016 at 5:27 AM, Steve Loughran <ste...@hortonworks.com> wrote:

On 15 Oct 2016, at 01:28, Ryan Blue <rb...@netflix.com.INVALID> wrote:

The Spark 2 branch is based on this one: 

Didn't know this had moved... I had an outstanding PR against Patrick's repo which 
should really go in, if not already taken up (HIVE-11720; 
https://github.com/pwendell/hive/pull/2).

IMO it would make sense if -somehow- that Hive fork were in the ASF; 
it has to be in sync with Spark releases, and it's not been ideal for me in 
terms of getting one or two fixes in, the other being culling
groovy 2.4.4 as an export ( 
https://github.com/steveloughran/hive/tree/stevel/SPARK-13471-groovy-2.4.4 )
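(The shape of that groovy fix, on the consumer side, is an ordinary Maven exclusion. A sketch only, with illustrative coordinates; the real fix in the fork changes Hive's own poms so downstream users don't each have to do this:)

```xml
<!-- Sketch: keep groovy-all off the classpath when depending on the
     forked Hive artifacts. Version and artifact shown are illustrative. -->
<dependency>
  <groupId>org.spark-project.hive</groupId>
  <artifactId>hive-exec</artifactId>
  <version>1.2.1.spark2</version>
  <exclusions>
    <exclusion>
      <groupId>org.codehaus.groovy</groupId>
      <artifactId>groovy-all</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```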

I don't know if the Hive team themselves would be up for having it in their 
repo, or if committership logistics would suit it anyway. Otherwise, 
approaching infra@ and asking for a forked repo is likely to work with a bit of 


On Fri, Oct 14, 2016 at 4:33 PM, Ethan Aubin <ethan.au...@gmail.com> wrote:
In an email thread [1] from Aug 2015, it was mentioned that the source
to org.spark-project.hive was at
https://github.com/pwendell/hive/commits/release-1.2.1-spark .
That branch has a 1.2.1.spark version, but Spark 2.0.1 uses
1.2.1.spark2. Could anyone point me to the repo for 1.2.1.spark2?
Thanks --Ethan



Ryan Blue
Software Engineer

