Github user jerryshao commented on the pull request:

    https://github.com/apache/spark/pull/8681#issuecomment-139123631
  
    Hi @zjffdu , is there any other way we could get the memory info of each 
executor? Is passing it through the executor launch arguments and reporting it 
back the only mechanism to achieve this? I would guess the Spark driver already 
has enough knowledge of each executor's memory size, e.g. from 
`spark.executor.memory`. I'm just wondering whether there is a simpler way to 
get this information.
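    For illustration, here is a minimal driver-side sketch of the simpler 
approach hinted at above: reading the configured executor heap size from the 
driver's own configuration instead of having each executor report it back. A 
plain `Map` stands in for `SparkConf` (the names `executorMemoryMb` and 
`memoryStringToMb` are hypothetical helpers, not Spark APIs), and the parsing 
mirrors Spark's "512m"/"2g" memory-string convention:

```java
import java.util.HashMap;
import java.util.Map;

public class ExecutorMemorySketch {

    /** Parse a Spark-style memory string ("512m", "2g", "100k") into megabytes. */
    static long memoryStringToMb(String str) {
        String s = str.trim().toLowerCase();
        if (s.endsWith("g")) {
            return Long.parseLong(s.substring(0, s.length() - 1)) * 1024L;
        } else if (s.endsWith("m")) {
            return Long.parseLong(s.substring(0, s.length() - 1));
        } else if (s.endsWith("k")) {
            return Long.parseLong(s.substring(0, s.length() - 1)) / 1024L;
        }
        // Bare number: treat as bytes.
        return Long.parseLong(s) / (1024L * 1024L);
    }

    /** Driver-side lookup; the conf map stands in for SparkConf (hypothetical). */
    static long executorMemoryMb(Map<String, String> conf) {
        // spark.executor.memory defaults to "1g" when unset.
        String raw = conf.getOrDefault("spark.executor.memory", "1g");
        return memoryStringToMb(raw);
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put("spark.executor.memory", "2g");
        System.out.println(executorMemoryMb(conf)); // prints 2048
    }
}
```

    This only works when all executors are launched with the same memory 
setting, which is the common case; heterogeneous executors would still need a 
report-back mechanism.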


