GitHub user zsxwing commented on the issue:

    https://github.com/apache/spark/pull/14961
  
    > @zsxwing you seem to understand this better, but is it that the default 
behavior changes and is probably a bad default now, or just that it's 
inappropriate for Spark?
    
    I don't have a full explanation yet, but my guess is that we never release 
the direct buffers created by Netty and instead count on GC to free them. Per 
the following description in this commit,
    
    > Add new buffer implementations that can be enabled with a system flag as 
optimizations. In this case no Cleaner is used at all and the user must ensure 
everything is always released.
    
    we either need to stop using the `no cleaner` direct buffers or fix every 
place that leaks them.
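
    To make the distinction concrete, here is a minimal sketch (the class name 
and buffer size are only illustrative) of what "the user must ensure 
everything is always released" means for Netty's reference-counted direct 
buffers:

    ```java
    import io.netty.buffer.ByteBuf;
    import io.netty.buffer.PooledByteBufAllocator;

    public class NoCleanerReleaseSketch {
        public static void main(String[] args) {
            // With the "no cleaner" implementations there is no Cleaner attached,
            // so the native memory is only freed when the reference count drops
            // to zero via release().
            ByteBuf buf = PooledByteBufAllocator.DEFAULT.directBuffer(1024);
            try {
                buf.writeBytes("payload".getBytes());
                // ... hand the buffer to whatever needs it ...
            } finally {
                // Explicit release is mandatory on this path; counting on GC, as
                // we appear to do today, leaks the native memory behind the buffer.
                buf.release();
            }
        }
    }
    ```

    If I read the commit right, the `no cleaner` path is gated by the 
`io.netty.maxDirectMemory` system property (setting it to 0 should fall back 
to the cleaner-backed buffers), but please double-check that before relying 
on it.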

