On 05.01.2018 12:46, Suvendu Sekhar Mondal wrote:
> I really never found any explanation behind this "initial=max" heap size
> theory until I saw your mail; although I see this type of configuration in
> most of the places. It will be awesome if you can tell more about benefits
> of this configuration.
The thing is: you're allowing your application to sooner or later allocate *max* heap, no matter what. Granted, when the server just starts up, it might require less, but in the end (say, after an hour, a day, or a couple of days) it might end up at max anyway. Server processes typically run long. And if your OS runs out of memory, not being able to grant those last gigs of memory that you're asking for /at some random time/, it will trigger an out-of-memory condition. Instead of detecting such a condition (to add drama) Sunday night at 3am, why not catch it when your process just starts up?

Java will never return heap memory to the OS unless you stop the process, so you can't plan to use that RAM for anything else anyway. Imagine having 4G of available memory in your server. Now run your Java process with "initial=2G, max=8G". Try to predict when it'll fail.

For your desktop development machine, where you potentially restart the server constantly: feel free to set different initial/max values. For a server that's supposed to be running for a long time: admit that you're allocating that memory to Tomcat anyway.
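The initial and max sizes under discussion are the JVM's -Xms and -Xmx flags (for Tomcat, typically set via CATALINA_OPTS or setenv.sh). A minimal sketch for observing how much heap the JVM has actually committed versus its ceiling - the class name and output format are just for illustration:

```java
// Run with e.g.: java -Xms512m -Xmx512m HeapReport
public class HeapReport {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long total = rt.totalMemory(); // heap currently committed by the JVM
        long max = rt.maxMemory();     // upper bound set by -Xmx
        long free = rt.freeMemory();   // unused portion of the committed heap
        System.out.printf("committed=%dM max=%dM used=%dM%n",
                total >> 20, max >> 20, (total - free) >> 20);
        // With -Xms equal to -Xmx, committed equals max from startup,
        // so an over-commit failure surfaces at launch, not days later.
    }
}
```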
> I usually do not set initial and max heap size to same value because
> garbage collection is delayed until the heap is full. Therefore, the first
> time that the GC runs, the process can take longer. Also, the heap is more
> likely to be fragmented and require a heap compaction. To avoid that, till
> now my strategy is to:
> - Start application with the minimum heap size that application requires
> - When the GC starts up, it runs frequently and efficiently because the
> heap is small
> - When the heap is full of live objects, the GC compacts the heap. If
> sufficient garbage is still not recovered or any of the other conditions
> for heap expansion are met, the GC expands the heap.
Are you sure (extra extra /extra/ sure) that this is indeed the specified condition under which the JVM allocates new memory from the OS? And that this condition is stable between different versions/releases/implementations?
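Rather than assuming a particular expansion policy, you can at least observe what your specific JVM actually does: the standard java.lang.management API reports the heap's initial, committed, and maximum sizes, and the committed figure moves as the collector grows or shrinks the heap. A sketch (class name is illustrative):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

public class HeapPolicyProbe {
    public static void main(String[] args) {
        MemoryUsage heap =
                ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        // init: what -Xms requested; committed: what the JVM has actually
        // reserved so far; max: the -Xmx ceiling. When and how committed
        // grows between init and max is VM- and collector-specific.
        System.out.printf("init=%dM committed=%dM max=%dM%n",
                heap.getInit() >> 20,
                heap.getCommitted() >> 20,
                heap.getMax() >> 20);
    }
}
```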
> Another thing, what if I know the server load varies a lot (from 10s in
> night time to 10000s during day time) during different time frames, does
> "initial=max heap" apply for that situation also?
After running for a day, you might end up with max allocation anyways. If this allocation is followed by a low load phase, you haven't gained anything. Java is not returning unused memory to the OS.

And if you're relying on your application never exceeding a certain amount of memory, there's another thing to consider for memory allocation:

Measure your application's memory requirement. Allocate enough memory for your application's highest demand (plus some safety margin) and cap it there. You basically want the lowest amount of memory allocated that suits your application in the long run, to keep GC frequent and quick rather than infrequent but slow. In testing, setting different initial/max values might help you get closer to such a value. In production I wouldn't rely on it; I'd rather know immediately (i.e. when starting the process) whether enough memory is available - rather than Sunday night at 3am.
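As a rough illustration of "measure first": sample the used heap while the application runs and keep the peak. Real sizing would rather use GC logs or JMX monitoring over days of production traffic; the class name and the allocation loop standing in for a workload are made up for this sketch:

```java
import java.lang.management.ManagementFactory;

public class PeakHeapSampler {
    private static volatile long peakUsed = 0;

    // Record the current used heap if it exceeds the peak seen so far.
    public static void sample() {
        long used = ManagementFactory.getMemoryMXBean()
                .getHeapMemoryUsage().getUsed();
        if (used > peakUsed) peakUsed = used;
    }

    public static long peakMegabytes() {
        return peakUsed >> 20;
    }

    public static void main(String[] args) throws InterruptedException {
        // Stand-in for the real workload: allocate 1 MB per step.
        byte[][] load = new byte[10][];
        for (int i = 0; i < 10; i++) {
            load[i] = new byte[1 << 20];
            sample();
            Thread.sleep(10);
        }
        System.out.println("peak heap used ~" + peakMegabytes() + " MB");
        // Size -Xmx to the observed peak plus a margin, then set -Xms to match.
    }
}
```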

There's an argument that this was particularly necessary in the 32bit times, with the JVM demanding contiguous address space. With a 64bit address space, this particular aspect shouldn't be a problem any more; however, the generally available memory (note: available to the process, not the hardware memory you stuck into your server) can still be an issue. You might want to conserve it.

Olaf

---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscr...@tomcat.apache.org
For additional commands, e-mail: users-h...@tomcat.apache.org
