Todd Lipcon commented on PIG-924:

Hey guys,

Any word on this? From the packaging perspective it's pretty important that a 
single build of Pig work with both Hadoop 18 and Hadoop 20. Obviously 
packaging isn't the Yahoo team's highest priority, but I think it matters a 
lot for community adoption. If we require separate builds for 18 and 20, 
that's one more thing that can confuse new users.

As I understand it from Dmitriy, for this to work we just need to stop packing 
the Hadoop JAR into the pig JAR. Instead, the wrapper script just needs to 
specify the hadoop JAR on the classpath. Is there some barrier to doing this 
that I'm unaware of?
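To make the suggestion concrete, here's a rough sketch of what such a wrapper could look like. All the names here (HADOOP_HOME, PIG_HOME, the jar glob) are assumptions for illustration, not the actual pig script:

```shell
#!/bin/sh
# Hypothetical wrapper sketch: instead of bundling Hadoop inside pig.jar,
# pick up whatever Hadoop jar is installed and put it on the classpath
# at launch time.
build_classpath() {
  # first hadoop core jar found under $HADOOP_HOME (name pattern assumed)
  hadoop_jar=$(ls "${HADOOP_HOME}"/hadoop-*-core.jar 2>/dev/null | head -n 1)
  echo "${PIG_HOME}/pig.jar:${hadoop_jar}:${HADOOP_HOME}/conf"
}

CLASSPATH=$(build_classpath)
# The real script would then launch Pig with that classpath, e.g.:
#   exec java -cp "$CLASSPATH" org.apache.pig.Main "$@"
echo "$CLASSPATH"
```

The point being that the Hadoop version is resolved at launch time from the user's install, so the same pig.jar works against whichever Hadoop is present.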

> Make Pig work with multiple versions of Hadoop
> ----------------------------------------------
>                 Key: PIG-924
>                 URL: https://issues.apache.org/jira/browse/PIG-924
>             Project: Pig
>          Issue Type: Bug
>            Reporter: Dmitriy V. Ryaboy
>         Attachments: pig_924.patch
> The current Pig build scripts package hadoop and other dependencies into the 
> pig.jar file.
> This means that if users upgrade Hadoop, they also need to upgrade Pig.
> Pig has relatively few dependencies on Hadoop interfaces that changed between 
> 18, 19, and 20.  It is possible to write a dynamic shim that allows Pig to 
> use the correct calls for any of the above versions of Hadoop. Unfortunately, 
> the current build process rules out selecting the shim at runtime, and 
> forces an unnecessary Pig rebuild even if dynamic shims are created.
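For readers unfamiliar with the shim idea in the description above, a minimal sketch looks something like this. All class and method names here are hypothetical, and in a real build the version string would come from Hadoop's own VersionInfo rather than a parameter:

```java
// Sketch of the dynamic-shim idea: one implementation per incompatible
// Hadoop API generation, with a loader that picks the right one from the
// running Hadoop's version string.
interface HadoopShims {
    String apiFlavor();
}

class Hadoop18Shims implements HadoopShims {
    public String apiFlavor() { return "mapred-0.18"; }
}

class Hadoop20Shims implements HadoopShims {
    public String apiFlavor() { return "mapred-0.20"; }
}

final class ShimLoader {
    // In a real deployment the argument would be obtained at runtime,
    // e.g. from org.apache.hadoop.util.VersionInfo.getVersion().
    static HadoopShims getShims(String hadoopVersion) {
        // keep only "major.minor", e.g. "0.18.3" -> "0.18"
        String[] parts = hadoopVersion.split("\\.");
        String majorMinor = parts[0] + "." + parts[1];
        // grouping of 18/19 together is an assumption for illustration
        if (majorMinor.equals("0.18") || majorMinor.equals("0.19")) {
            return new Hadoop18Shims();
        }
        return new Hadoop20Shims();
    }
}
```

With something like this in place, the same pig.jar could call the right API at runtime, which is exactly what the current bundled-Hadoop build prevents.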

This message is automatically generated by JIRA.
You can reply to this email to add a comment to the issue online.
