Github user squito commented on the pull request:
https://github.com/apache/spark/pull/7753#issuecomment-126786492
@jerryshao I don't think that's true. The executor metrics that get sent
back to the driver include both the max value and the current value, so a
custom listener could already track the minimum memory usage if it wanted to.
I wonder whether we should calculate the max inside the executor at all, or
just leave it to the listeners. On one hand, max is what you will usually
want, so we might as well make it more convenient. OTOH, maybe it would be
cleaner to leave it to the listeners ... e.g. I can already imagine that in
the future we may want to track max memory usage during each stage, and I
definitely think that logic should be left to the listeners.
What do you think @liyezhang556520? I will think about it more, but right now
I'm leaning towards just leaving the max to the listeners ...
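A minimal sketch of the listener-side approach discussed above: the executor only reports its *current* memory usage on each update, and a custom listener derives the per-stage maximum (or minimum) itself. This is plain Java with no Spark dependency, and names like `StageMemoryTracker` and `onMetricsUpdate` are hypothetical, not actual Spark listener APIs.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.OptionalLong;

// Hypothetical aggregator illustrating how a custom listener could compute
// max (and min) memory usage per stage from raw "current usage" updates,
// instead of having the executor compute the max itself.
public class StageMemoryTracker {
    // stageId -> running max of the reported current memory usage
    private final Map<Integer, Long> maxByStage = new HashMap<>();
    // stageId -> running min, showing the same mechanism covers minima too
    private final Map<Integer, Long> minByStage = new HashMap<>();

    // Called for every update carrying an executor's current usage in bytes.
    public void onMetricsUpdate(int stageId, long currentBytes) {
        maxByStage.merge(stageId, currentBytes, Math::max);
        minByStage.merge(stageId, currentBytes, Math::min);
    }

    public OptionalLong maxFor(int stageId) {
        Long v = maxByStage.get(stageId);
        return v == null ? OptionalLong.empty() : OptionalLong.of(v);
    }

    public OptionalLong minFor(int stageId) {
        Long v = minByStage.get(stageId);
        return v == null ? OptionalLong.empty() : OptionalLong.of(v);
    }
}
```

The point is that once the raw current value reaches the driver, any aggregation (max overall, max per stage, min, percentiles) can live in a listener without further executor-side changes.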