Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22796#discussion_r227046597
  
    --- Diff: dev/deps/spark-deps-hadoop-2.7 ---
    @@ -148,7 +148,7 @@ metrics-graphite-3.1.5.jar
     metrics-json-3.1.5.jar
     metrics-jvm-3.1.5.jar
     minlog-1.3.0.jar
    -netty-3.9.9.Final.jar
    +netty-3.7.0.Final.jar
    --- End diff --
    
    I see, I think what's happening is that we still have a netty 3.x 
dependency from Hadoop. I guess we should simply leave that in place.
    
    I think that previously we had tried to manually exclude netty dependencies 
because it helped SBT resolve the version in the same way as Maven. I don't 
know that we actually want to take netty 3.x away from ZK and Hadoop if they 
still need it.
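    
    For illustration only, this is roughly what that kind of manual exclusion 
looks like in an sbt build definition (a sketch, not the actual Spark build; 
the dependency and version here are just examples):
    
        // Hypothetical sketch: drop the old netty 3.x artifact (io.netty:netty)
        // from a Hadoop dependency so SBT resolves it the same way Maven does.
        libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.7.4" excludeAll (
          ExclusionRule(organization = "io.netty", name = "netty")  // netty 3.x artifactId is "netty"
        )
    
    Whether we want exclusions like that at all is the question here, since 
Hadoop and ZK may legitimately need netty 3.x at runtime.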
    
    At least, this change doesn't actually remove netty 3.x from the 
distribution, and we don't necessarily need to do that. I just wanted to make 
sure Spark itself has no reference to it and isn't trying to manage or depend 
on it.

