On Wed, 5 Mar 2025 21:34:09 GMT, Alexander Zuev <kiz...@openjdk.org> wrote:

>> - Check that the calculated audio data size does not exceed available heap 
>> memory before committing to the rendering
>> - Add a test case
>
> Alexander Zuev has updated the pull request incrementally with two additional 
> commits since the last revision:
> 
>  - Merge branch 'JDK-8350813' of https://github.com/azuev-java/jdk into 
> JDK-8350813
>  - Add a threshold of 90% to avoid OOME because of the additional buffers 
> needed to process sound bank.

Looking at this from the GC perspective...

What concerns me here is hooking policy up to "observed" heap stats: it 
second-guesses what the GC does, which is not great for non-core JDK code. 
Checking `maxMemory` alone might be fine, since it is straight up 
`CollectedHeap::max_capacity`, which is essentially Xmx. But checking 
`freeMemory` throws us at the mercy of the GC doing prompt reclamations. I can 
envision a scenario where a super-lazy GC delays collection until only a 
single-digit percentage of total memory is free, even though an actual GC 
cycle would throw away a lot of garbage.

That is, I think I can construct a case where:


maxMemory = 2048M (Xmx)
totalMemory = 2048M (actually expanded heap size)
freeMemory = 64M (lazy GC)
<floating garbage> = (2G - delta)


Then:


long maximumHeapSize = (long) ((Runtime.getRuntime().maxMemory() -
                    (Runtime.getRuntime().totalMemory() - 
Runtime.getRuntime().freeMemory())) * 0.9); 


...computes to a measly 57M (check my math) and thus throws OOME prematurely.
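Plugging the scenario's numbers into that expression confirms the arithmetic. A standalone sketch (not part of the patch; the hard-coded values stand in for the `Runtime` calls):

```java
public class HeapMathCheck {
    public static void main(String[] args) {
        long mb = 1024L * 1024L;
        long maxMemory = 2048 * mb;   // Xmx, per the scenario above
        long totalMemory = 2048 * mb; // fully expanded heap
        long freeMemory = 64 * mb;    // lazy GC has reclaimed little

        // Same formula as the snippet above, with the Runtime calls
        // replaced by the scenario's values.
        long maximumHeapSize =
                (long) ((maxMemory - (totalMemory - freeMemory)) * 0.9);

        // 64M of apparent headroom * 0.9 = 57.6M
        System.out.println(maximumHeapSize / mb + "M"); // prints "57M"
    }
}
```

So even with ~2G of floating garbage that a single GC cycle could reclaim, the check would reject anything over 57M.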

So, I would say:
 1. Throw OOME when the requested size is > maxMemory, because there is no 
practical way the GC could ever satisfy the allocation.
 2. Proceed with the allocation otherwise, and let the GC either deal with it 
-- by triggering a cleanup -- or throw OOME to indicate it failed to do so.
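In code, the suggested policy amounts to something like the sketch below (method name and message are hypothetical, not from the patch):

```java
public class AudioBufferAllocator {
    /**
     * Allocate the audio data buffer, failing fast only when the request
     * cannot possibly fit in the heap (size > Xmx). Otherwise attempt the
     * allocation and let the GC either reclaim enough garbage to satisfy
     * it or throw OutOfMemoryError itself.
     */
    static byte[] allocateAudioData(long requiredBytes) {
        if (requiredBytes > Runtime.getRuntime().maxMemory()) {
            // No GC cycle can free more than Xmx worth of memory,
            // so this request can never succeed: fail fast.
            throw new OutOfMemoryError(
                    "Audio data size exceeds maximum heap size");
        }
        // This may still throw OOME, but only after the GC has had a
        // chance to run and genuinely could not free enough memory.
        return new byte[Math.toIntExact(requiredBytes)];
    }
}
```

Note there is no `totalMemory`/`freeMemory` arithmetic at all: the transient state of the heap is left for the GC to sort out at allocation time.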

-------------

PR Comment: https://git.openjdk.org/jdk/pull/23814#issuecomment-2732136470
