So if I'm running Spark 2.0.2 built against Hadoop 2.6, I should be running
[Netty 4.0.29.Final](https://github.com/apache/spark/blob/v2.0.2/dev/deps/spark-deps-hadoop-2.6#L141),
right?
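
For what it's worth, here's how I've been double-checking at runtime rather
than from the deps manifest. This is just a sketch and assumes a stock build
where netty-all 4.x is on the spark-shell classpath (`io.netty.util.Version`
is a Netty 4.x API):

```scala
// Run inside spark-shell. Netty 4.x reports the version of every netty
// artifact it finds on the classpath.
import scala.collection.JavaConverters._

io.netty.util.Version.identify().asScala.foreach { case (artifact, version) =>
  println(s"$artifact -> ${version.artifactVersion}")
}
```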

And since [the Netty PR I'm interested in](https://github.com/netty/netty/pull/5345)
is tagged 4.0.37.Final, I guess Spark 2.0.2 isn't using a version of Netty
that includes that PR. That's consistent with what I'm seeing in my
environment: warnings about low entropy, followed by executor failures.
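
In the meantime, assuming the failures really do trace back to executors
blocking on /dev/random (which I haven't confirmed), the standard JVM-level
mitigation, independent of that Netty PR and untested on my end, is to seed
SecureRandom from /dev/urandom:

```scala
// Possible mitigation, not a confirmed fix: point the JVM's SecureRandom
// at /dev/urandom so executors don't block when the entropy pool is low.
// The "/dev/./urandom" spelling sidesteps an old JDK special case that
// ignores a plain "/dev/urandom".
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .set("spark.executor.extraJavaOptions",
    "-Djava.security.egd=file:/dev/./urandom")

// The driver-side equivalent generally has to go in spark-defaults.conf
// or on the spark-submit command line, since the driver JVM is already
// running by the time a SparkConf set in code is read.
```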

OK cool! Thanks for the pointers.

Nick

On Mon, Dec 5, 2016 at 2:18 PM Sean Owen <so...@cloudera.com> wrote:

> netty should be Netty 3.x. It is all but unused, but I couldn't manage to
> get rid of it: https://issues.apache.org/jira/browse/SPARK-17875
>
> netty-all should be 4.x, actually used.
>
>
> On Tue, Dec 6, 2016 at 12:54 AM Nicholas Chammas <
> nicholas.cham...@gmail.com> wrote:
>
> I’m looking at the list of dependencies here:
>
>
> https://github.com/apache/spark/search?l=Groff&q=netty&type=Code&utf8=%E2%9C%93
>
> What’s the difference between netty and netty-all?
>
> The reason I ask is because I’m looking at a Netty PR
> <https://github.com/netty/netty/pull/5345> and trying to figure out if
> Spark 2.0.2 is using a version of Netty that includes that PR or not.
>
> Nick