nickstanishadb commented on PR #44678:
URL: https://github.com/apache/spark/pull/44678#issuecomment-1900503295

   @dtenedor I just had another thought about making something like this more 
usable for Python users. It would be awesome to also provide a utility that 
lets them inspect the instantaneous memory usage of their worker process 
(roughly along the lines of the sketch below).
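
   As a minimal sketch (assuming `psutil` is available on the worker; nothing 
   like this ships with PySpark today, and the helper name here is made up):

   ```python
   import os

   import psutil  # assumed third-party dependency, not bundled with PySpark


   def current_rss_bytes() -> int:
       """Instantaneous resident set size (RSS) of this Python worker process."""
       return psutil.Process(os.getpid()).memory_info().rss
   ```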
   
   They could call this during `analyze()` to measure the size of their 
in-scope libraries and, assuming things are comparable on the executors, 
request that amount plus X MB for any additional data they expect to collect. 
They could also call it inside `eval()` and `terminate()` and run some 
conditional logic whenever they get close to their allocated memory limit, as 
in the second sketch below.
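
   A rough sketch of how that conditional logic might look in a UDTF (the 
   `@udtf` decorator, `eval()`, and `terminate()` follow the existing Python 
   UDTF API; `current_rss_bytes()` is the hypothetical helper above, and the 
   budget is just an illustrative number):

   ```python
   from pyspark.sql.functions import udtf


   @udtf(returnType="word: string")
   class SplitWords:
       # Hypothetical per-worker budget the user asked for (bytes).
       MEMORY_BUDGET = 512 * 1024 * 1024

       def __init__(self):
           self._buffer = []

       def eval(self, text: str):
           self._buffer.extend(text.split())
           # Flush buffered rows early when close to the budget instead of
           # accumulating more and risking an OOM.
           if current_rss_bytes() > 0.9 * self.MEMORY_BUDGET:
               for word in self._buffer:
                   yield (word,)
               self._buffer = []

       def terminate(self):
           for word in self._buffer:
               yield (word,)
   ```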


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

