[GitHub] spark issue #18482: [SPARK-21262] Stop sending 'stream request' when shuffle...

2017-07-06 Thread jinxing64
Github user jinxing64 commented on the issue: https://github.com/apache/spark/pull/18482 Sure, I will update the document soon.

[GitHub] spark issue #18482: [SPARK-21262] Stop sending 'stream request' when shuffle...

2017-07-06 Thread zsxwing
Github user zsxwing commented on the issue: https://github.com/apache/spark/pull/18482 On second thought, I don't think we need this PR. We can disable `spark.reducer.maxReqSizeShuffleToMem` by default. Let's just document that this configuration will break the old shuffle service and the…
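A minimal sketch of the mitigation zsxwing suggests, assuming the idea is to set the threshold so high that reducers never send a 'stream request' (which an old external shuffle service cannot handle). The exact value is illustrative; this is a config fragment, not code from the PR:

```scala
import org.apache.spark.SparkConf

// Effectively disable shuffle-to-disk streaming by raising the threshold
// to Long.MaxValue, so fetch requests always stay in memory and no
// 'stream request' is ever sent to an old shuffle service.
val conf = new SparkConf()
  .set("spark.reducer.maxReqSizeShuffleToMem", Long.MaxValue.toString)
```

Users who run a new enough external shuffle service could then opt in by lowering the value explicitly.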

[GitHub] spark issue #18482: [SPARK-21262] Stop sending 'stream request' when shuffle...

2017-07-04 Thread jinxing64
Github user jinxing64 commented on the issue: https://github.com/apache/spark/pull/18482 Very gentle ping @zsxwing. What do you think about this idea?

[GitHub] spark issue #18482: [SPARK-21262] Stop sending 'stream request' when shuffle...

2017-06-30 Thread jinxing64
Github user jinxing64 commented on the issue: https://github.com/apache/spark/pull/18482 In the current change, it is fetching the big chunk into memory, then writing it to disk, and then releasing the memory. I made this change for the reasons below: 1. The client shouldn't break the old shuffle…
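The fetch-then-spill flow jinxing64 describes can be sketched as follows. This is a hypothetical standalone illustration (the object name `SpillSketch` and method `spillToDisk` are not from Spark): buffer the fetched block fully in memory, write it to a local file, then drop the in-memory reference so it can be garbage-collected.

```scala
import java.io.{File, FileOutputStream}
import java.nio.file.Files

object SpillSketch {
  // Write an in-memory block to a temp file; the caller then releases
  // the byte array, mirroring "fetch big chunk in memory, write to
  // disk, release the memory".
  def spillToDisk(fetched: Array[Byte], dir: File): File = {
    val out = File.createTempFile("shuffle-block-", ".data", dir)
    val fos = new FileOutputStream(out)
    try fos.write(fetched) finally fos.close()
    out
  }

  def main(args: Array[String]): Unit = {
    // Pretend this 1 KiB buffer was fetched over the network.
    var block: Array[Byte] = Array.fill[Byte](1024)(1)
    val file = spillToDisk(block, new File(System.getProperty("java.io.tmpdir")))
    block = null // release the in-memory copy once it is safely on disk
    println(Files.readAllBytes(file.toPath).length)
  }
}
```

The trade-off discussed in the thread is that the big chunk still transits through memory once, but the client stays compatible with the old shuffle service's wire protocol.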

[GitHub] spark issue #18482: [SPARK-21262] Stop sending 'stream request' when shuffle...

2017-06-30 Thread cloud-fan
Github user cloud-fan commented on the issue: https://github.com/apache/spark/pull/18482 Does this mean we have to fetch big chunks into memory and then write them to disk?