Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/12395#issuecomment-212851362
OK, so there's not actually a hard reason to enforce a minimum of 300MB --
it isn't being "accidentally" enforced now. Indeed, 300MB is probably too high
a minimum anyway. At the least, we can put back the move of the 300MB constant,
since it is really specific to the UnifiedMemoryManager. The classic memory
manager path could enforce a much lower minimum -- 32MB, say -- something below
which we think Spark can't reasonably run and which is almost surely an input
typo.
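To make that concrete, here is a minimal sketch of what per-manager minimums
could look like. The 300MB reserved constant does live in UnifiedMemoryManager,
but the object name, method names, and the 32MB floor for the static path are
illustrative assumptions, not Spark's actual API:

```scala
// Hypothetical sketch: each memory manager path enforces its own floor.
object MemoryFloorCheck {

  // Specific to UnifiedMemoryManager: reserved system memory.
  private val RESERVED_SYSTEM_MEMORY_BYTES: Long = 300L * 1024 * 1024

  // Assumed floor for the classic (static) path -- the 32MB value floated
  // above, below which a setting is almost surely a typo.
  private val MIN_STATIC_MEMORY_BYTES: Long = 32L * 1024 * 1024

  // Check used on the unified path.
  def validateUnified(systemMemory: Long): Unit = {
    require(systemMemory >= RESERVED_SYSTEM_MEMORY_BYTES,
      s"System memory $systemMemory must be at least " +
      s"$RESERVED_SYSTEM_MEMORY_BYTES. Please increase heap size using " +
      "the --driver-memory or spark.driver.memory option.")
  }

  // Check used on the classic (static) path, with the much lower floor.
  def validateStatic(systemMemory: Long): Unit = {
    require(systemMemory >= MIN_STATIC_MEMORY_BYTES,
      s"System memory $systemMemory is below the $MIN_STATIC_MEMORY_BYTES " +
      "minimum; this is almost surely a configuration typo.")
  }
}
```

This keeps the 300MB constant out of the shared code path while still failing
fast on obviously-wrong inputs in both managers.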