If we were to drop CDH4 / Hadoop 2.0.0-alpha, would that mean we no longer
need to shade the Hadoop fat jars at all, or would we still need to do so to
support 1.x?

- Henry

On Thu, Feb 26, 2015 at 8:57 AM, Robert Metzger <rmetz...@apache.org> wrote:
> Hi,
>
> I'm currently working on https://issues.apache.org/jira/browse/FLINK-1605
> and it's a hell of a mess.
>
> I got almost everything working, except for the Hadoop 2.0.0-alpha profile.
> The profile exists because that Hadoop release ships a different version of
> Google protobuf.
> Since Maven sets the protobuf version for the entire project to that older
> version, we have to use an older Akka version, which is causing issues.
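>
> For illustration, the profile essentially pins one protobuf version for the
> whole build, roughly like this (the property name and version numbers here
> are a sketch, not the exact POM):
>
>     <profile>
>       <id>hadoop-2.0.0-alpha</id>
>       <properties>
>         <!-- sketch: 2.0.0-alpha ships an older protobuf (2.4.x), while the
>              Akka version we would like to use needs 2.5.x -->
>         <protobuf.version>2.4.1</protobuf.version>
>       </properties>
>     </profile>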
>
> The logical conclusion from that would be to shade Hadoop's protobuf version
> into the Hadoop jars. That by itself works; however, it does not work for
> the "flink-yarn-tests".
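>
> The relocation itself is the standard maven-shade-plugin mechanism, roughly
> like this (the shaded package name is only an example):
>
>     <plugin>
>       <groupId>org.apache.maven.plugins</groupId>
>       <artifactId>maven-shade-plugin</artifactId>
>       <executions>
>         <execution>
>           <phase>package</phase>
>           <goals><goal>shade</goal></goals>
>           <configuration>
>             <relocations>
>               <relocation>
>                 <!-- rewrite protobuf classes into a shaded namespace so the
>                      Hadoop jars carry their own copy -->
>                 <pattern>com.google.protobuf</pattern>
>                 <shadedPattern>org.apache.flink.hadoop.shaded.com.google.protobuf</shadedPattern>
>               </relocation>
>             </relocations>
>           </configuration>
>         </execution>
>       </executions>
>     </plugin>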
>
> I think I can also solve the issue with the flink-yarn-tests, but it would
> be a very dirty hack (either injecting shaded code into the failsafe test
> classpath or putting test code into src/main).
>
> But the general question remains: are we willing to keep spending a lot of
> time maintaining the profile?
> Till has spent a lot of time recently fixing failing test cases for that old
> Akka version, I have now spent almost two days getting the
> shading/dependencies right, and I'm sure we'll keep running into trouble
> with the profile.
>
>
> Therefore, I was wondering if this is the right time to drop support for
> CDH4 / Hadoop 2.0.0-alpha.
>
>
> Best,
> Robert
