GitHub user lipzhu opened a pull request:
https://github.com/apache/spark/pull/22765
[SPARK-25757][Build] Upgrade netty-all from 4.1.17.Final to 4.1.30.Final
## What changes were proposed in this pull request?
Upgrade netty dependency from 4.1.17 to 4.1.30.
Explanation:
Currently, sending a ChunkedByteBuffer with more than 16 chunks over the network triggers a "merge" of all the blocks into one big transient array, which is then sent over the network. This is problematic because the total memory across all chunks can be high (up to 2GB), so the merge triggers a single 2GB allocation, which can cause OOM errors.
We can avoid this issue by upgrading Netty:
https://github.com/netty/netty/pull/8038
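To make the problem concrete, here is a minimal sketch in plain `java.nio` (not Spark's actual ChunkedByteBuffer or Netty code; the class and method names are hypothetical) of the merge pattern described above. Copying every chunk into one transient array requires a second allocation of the full payload size, roughly doubling peak memory for large transfers; the Netty fix avoids materializing that single big buffer.

```java
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

public class ChunkMergeSketch {
    // Hypothetical illustration of the pre-fix behavior: merging N chunks
    // requires one transient allocation of the *total* payload size. For a
    // 2GB payload this is a single 2GB array on top of the chunks themselves.
    static ByteBuffer mergeChunks(List<ByteBuffer> chunks) {
        int total = chunks.stream().mapToInt(ByteBuffer::remaining).sum();
        ByteBuffer merged = ByteBuffer.allocate(total); // the problematic big transient copy
        for (ByteBuffer c : chunks) {
            merged.put(c.duplicate()); // duplicate() so the source chunk's position is untouched
        }
        merged.flip(); // prepare the merged buffer for reading/sending
        return merged;
    }

    public static void main(String[] args) {
        // 20 small chunks stand in for the >16-chunk case that triggers the merge.
        List<ByteBuffer> chunks = new ArrayList<>();
        for (int i = 0; i < 20; i++) {
            ByteBuffer b = ByteBuffer.allocate(4);
            b.putInt(i);
            b.flip();
            chunks.add(b);
        }
        ByteBuffer merged = mergeChunks(chunks);
        System.out.println("merged bytes: " + merged.remaining());
    }
}
```

Netty's fix (linked above) instead lets the chunks be written out without first coalescing them into one array, so peak memory stays proportional to the chunk size rather than the total payload size.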
## How was this patch tested?
Manual tests in some Spark jobs.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/lipzhu/spark SPARK-25757
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/22765.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #22765
----
commit 0753c0fe907070b50812b3fb40868dfcdb2bd969
Author: Zhu, Lipeng <lipzhu@...>
Date: 2018-10-17T11:58:05Z
upgrade netty-all from 4.1.17.Final to 4.1.30.Final
----