Github user ryan-williams commented on the pull request:
https://github.com/apache/spark/pull/3524#issuecomment-65330404
OK, I think I understand the intention. You want to increase the amount of
memory the pool thinks you have from `myMemoryThreshold` to `2*currentMemory`,
hence requesting the difference. Maybe I'd just change the comment from "Claim
up to double our current memory from the shuffle memory pool" to "Increase our
memory pool footprint to double our current memory"?
re: `myMemoryThreshold` being initialized to "0 until it has at least 1000
elements", doesn't it [get initialized to
`spark.shuffle.spill.initialMemoryThreshold` (default
5MB)](https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/util/collection/Spillable.scala#L55)?
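i.e. roughly the following (paraphrasing the linked lines, so the exact form may differ):

```scala
// Paraphrase of the linked Spillable.scala initialization (not copied verbatim):
// the threshold starts at spark.shuffle.spill.initialMemoryThreshold, not 0.
private[this] val initialMemoryThreshold: Long =
  SparkEnv.get.conf.getLong("spark.shuffle.spill.initialMemoryThreshold", 5 * 1024 * 1024)

private[this] var myMemoryThreshold: Long = initialMemoryThreshold
```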