On 11/19/07, John Stanton <[EMAIL PROTECTED]> wrote:
> Joe Wilson wrote:

> > If a C program employs perfect 1:1 malloc'ing to free'ing, i.e., has no
> > memory leaks, then garbage collection is irrelevant to the topic of
> > memory fragmentation. It's not like C can employ a copying garbage
> > collector that moves memory blocks after free() without the knowledge
> > or participation of the host program. The malloc() call is where
> > fragmentation happens. Fragmentation in malloc depends on your allocation
> > strategy: first-fit, best-fit, short-lived versus long-lived pools,
> > per-allocation-size pools, statistical prediction, etc. Malloc must
> > guess where each allocation should go in order to limit future
> > memory fragmentation.

> If you never execute a free your dynamic memory is essentially contiguous.

Not necessarily, and that was his point about where fragmentation happens.

Many common allocators maintain multiple size classes to avoid
pathological fragmentation under most workloads.  If you allocate
blocks of several different sizes, your allocations will in fact be
spread across different regions of the available memory pool, and
therefore be fragmented without ever calling free().
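
You can see that for yourself with a minimal sketch like the one
below (the exact layout is up to your platform's allocator, so the
addresses will vary):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        /* Three different size classes, nothing freed yet. */
        void *small  = malloc(24);        /* likely a small-object bin */
        void *medium = malloc(512);       /* a different size class   */
        void *large  = malloc(64 * 1024); /* may be a separate region */

        /* On a size-class allocator these pointers typically land far
           apart, even though free() has never been called. */
        printf("small:  %p\nmedium: %p\nlarge:  %p\n",
               small, medium, large);

        free(small);
        free(medium);
        free(large);
        return 0;
    }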

Most common allocators are optimized to reach a steady state quickly,
so they incur only as much fragmentation as is necessary to handle
most arbitrary workloads.  That means putting up with some
fragmentation so that an application that doesn't leak memory at the
interface level will also not leak memory through the allocator's
internal bookkeeping, no matter what allocation pattern it uses.  The
application's allocation pattern can still affect how much
fragmentation there is, of course.
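
As a rough illustration of that steady-state behavior, here is a
sketch of a mixed-size allocate/free loop (assuming a typical
general-purpose allocator; watch the process with top or a similar
tool and the footprint should level off rather than grow):

    #include <stdlib.h>

    int main(void)
    {
        /* An allocator tuned for steady state recycles its internal
           bins, so the footprint plateaus even though the request
           sizes vary on every pass. */
        size_t sizes[] = { 16, 100, 700, 4096 };
        void  *slots[4] = { 0 };

        for (long pass = 0; pass < 1000000; pass++) {
            int i = pass % 4;
            free(slots[i]);          /* free(NULL) is a no-op early on */
            slots[i] = malloc(sizes[i]);
        }

        for (int i = 0; i < 4; i++)
            free(slots[i]);
        return 0;
    }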

The overall point here, though, is that even commonly implemented
malloc/free interfaces can be reliable enough to keep applications
running for years without trouble.  Completely deterministic behavior
is not required when probabilistic determinism is sufficient.  (Sorry,
I just had to use big words there.  IOW, building an application whose
behavior averages out to perfect is fine when you don't need a
guarantee of perfect behavior at every arbitrary point.  Most
applications don't measurably benefit from such a guarantee.)
