Hello. Let's run this program:
import core.sys.posix.unistd;
import std.stdio;
import core.memory;

void main () {
    uint size = 1024*1024*300;
    for (;;) {
        auto buf = new ubyte[](size);
        writefln("%s", size);
        sleep(1);
        size += 1024*1024*100;
        buf = null;
        GC.collect();
        GC.minimize();
    }
}
Pretty innocent, right? I'm even trying to help the GC here. But...
314572800
419430400
524288000
629145600
734003200
core.exception.OutOfMemoryError@(0)
oooops.
By the way, this is not actually "no more memory", this is "I'm out of
address space" (yes, I'm on a 32-bit system, GNU/Linux).
The question is: am I doing something wrong here? How can I force the GC to
stop eating my address space and reuse what it already has?
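One variation I considered (a sketch only; I haven't confirmed it avoids the 32-bit address-space growth) is to tell the GC explicitly that the block is dead via GC.free from core.memory, instead of just nulling the slice and hoping collection reuses the region before the next, larger request arrives. The loop is bounded here so it terminates:

```d
import core.memory;
import std.stdio;

void main() {
    uint size = 1024 * 1024 * 300;
    foreach (i; 0 .. 5) {
        auto buf = new ubyte[](size);
        writeln(size);
        // Explicitly release the block back to the GC before the
        // next, larger allocation, rather than waiting for a collection.
        GC.free(buf.ptr);
        buf = null;
        size += 1024 * 1024 * 100;
    }
}
```

Note that GC.free only works safely when no other slice still references the block, which is the case in this loop.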
Sure, I can use libc malloc(), refcounting, and so on, but the question
remains: why isn't the GC reusing memory it has already allocated and freed?
