D programs are a vital part of my home computer infrastructure: I run some 60 D processes at almost any given time... and have recently been running out of memory.

Each individual process eats ~30-100 MB, but multiply that by 60 and it adds up to trouble. They start off small, around 5 MB, and grow over weeks or months, so the leak isn't something I can easily reproduce in a debugger after recompiling.

I'm pretty sure this is the result of wasteful code somewhere in my underlying libraries, but nothing obvious jumps out when I read the code. So I want to inspect some of my existing processes and see just what is responsible for this.

I tried attaching to one with gdb and running `call gc_stats()`... and it segfaulted. Whoops.
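
My fallback plan is to build a hook into the programs themselves so I don't have to poke at them from gdb at all. Something like this sketch, which assumes `GC.stats()` from `core.memory` is available in my druntime version: a SIGUSR1 handler just sets a flag, and the main loop does the actual reporting, so nothing allocates inside the handler.

```d
// Sketch: make a live process dump its own GC stats on SIGUSR1,
// so I can `kill -USR1 <pid>` it instead of attaching gdb.
import core.memory : GC;
import core.sys.posix.signal;
import core.thread : Thread;
import core.time : msecs;
import std.stdio : writefln;

__gshared bool dumpRequested;

extern (C) void onUsr1(int) nothrow @nogc
{
    dumpRequested = true; // only set a flag; never allocate in a signal handler
}

void main()
{
    signal(SIGUSR1, &onUsr1);

    while (true)
    {
        // ... the program's real event loop would go here ...
        Thread.sleep(100.msecs);

        if (dumpRequested)
        {
            dumpRequested = false;
            auto s = GC.stats();
            writefln("GC heap: %s bytes used, %s bytes free", s.usedSize, s.freeSize);
        }
    }
}
```

That only gives totals, not call sites, but it would at least confirm whether the growth is on the GC heap at all.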


I am willing to recompile and run again, though I still need to actually use the programs, so if the instrumentation makes them unusable it won't really help. Is there perhaps a magic `--DRT-` argument? Or some trick with gdb attaching to a running process that I don't know about?
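
The only `--DRT-` switch I know of in this area is the GC profile option, e.g. `./myprogram --DRT-gcopt=profile:1` (program name is just a placeholder). As far as I understand, that only prints a summary when the process exits, with no per-line breakdown, so it may not be enough on its own. It can also be baked into the binary so the flag doesn't have to be passed on every run:

```d
// Embed the runtime option in the program itself, so every run
// gets GC profiling without remembering the command-line flag.
// To my understanding this only reports at normal process exit.
extern (C) __gshared string[] rt_options = [ "gcopt=profile:1" ];
```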

What I'm hoping to do is get an idea of which lines of code allocate the most memory that is never subsequently freed.
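
Worst case, I suppose I could recompile and bracket suspect call sites by hand: snapshot the heap, run the code, force a collection, and see what stuck around. A rough sketch (`reportGrowth` and `rebuildCache` are hypothetical names of mine):

```d
// Hypothetical helper: measure how much GC memory a block of code
// retains, i.e. allocates without it being collectable afterwards.
import core.memory : GC;
import std.stdio : writefln;

void reportGrowth(string label, scope void delegate() dg)
{
    GC.collect(); // settle the heap first, so growth below means retained memory
    immutable before = GC.stats().usedSize;

    dg();

    GC.collect(); // free whatever the block no longer references
    immutable after = GC.stats().usedSize;
    writefln("%s: %s bytes retained", label, cast(long) after - cast(long) before);
}

// at a suspect spot:
// reportGrowth("cache rebuild", { rebuildCache(); });
```

But doing that around every candidate line is exactly the tedium I'd like to avoid, hence the question.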
