HeartSaVioR commented on PR #38853: URL: https://github.com/apache/spark/pull/38853#issuecomment-1333053315
Nice finding! Your explanation makes sense to me. From a quick search, there is `reserve`, which may shrink the memory as desired, but the shrink does not appear to be guaranteed. (Relying on it would also mean changing the RocksDB code itself, and they may have their own valid reasons for the current behavior.) So I agree this is the best approach available to us, and the fix is straightforward.

Just a couple of questions:

1. Have you run the fix with your production workload for a while and confirmed that the memory issue no longer occurs?
2. Did https://github.com/apache/spark/commit/b8b1fbc21c66348d25be3404d3f61099f2a7a9b5 help you track down the "sort of leaking" memory, or did it simply report the resized memory size without revealing the issue?

Otherwise the fix itself looks OK to me.
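For anyone else following along, here is a minimal sketch of the "recreate instead of clear" idea, assuming the RocksJava `WriteBatchWithIndex` API; the class and method names (`BatchHolder`, `resetByClearing`, `resetByRecreating`) are illustrative only and are not the actual code touched by this PR:

```scala
import org.rocksdb.WriteBatchWithIndex

// Illustrative holder around a native write batch, not Spark's state store code.
class BatchHolder {
  // overwriteKey = true: a later write to the same key replaces the earlier one.
  private var writeBatch = new WriteBatchWithIndex(true)

  def put(key: Array[Byte], value: Array[Byte]): Unit = {
    writeBatch.put(key, value)
  }

  // Before the fix (as described in this PR): clear() empties the batch, but the
  // native buffer keeps the capacity it grew to, so memory from an unusually
  // large batch is retained indefinitely.
  def resetByClearing(): Unit = {
    writeBatch.clear()
  }

  // After the fix: close the native object and create a fresh one, which
  // releases the previously allocated buffer instead of reusing it.
  def resetByRecreating(): Unit = {
    writeBatch.close()
    writeBatch = new WriteBatchWithIndex(true)
  }
}
```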
