On Wednesday, 17 April 2019 at 16:27:02 UTC, Adam D. Ruppe wrote:
> D programs are a vital part of my home computer infrastructure. I run some 60 D processes at almost any time... and have recently been running out of memory.
>
> Each individual process eats ~30-100 MB, but that times 60 = trouble. They start off small, like 5 MB, and grow over weeks or months, so it isn't something I can easily isolate in a debugger after recompiling.
>
> I'm pretty sure this is the result of wasteful code somewhere in my underlying libraries, but nothing is obviously jumping out at me in the code. So I want to look at some of my existing processes and see just what is responsible for this.
>
> I tried attaching to one and `call gc_stats()` in gdb... and it segfaulted. Whoops.
>
> I am willing to recompile and run again, though I need to actually use the programs, so if instrumenting makes them unusable it won't really help. Is there a magic --DRT- argument perhaps? Or some trick with gdb attaching to a running process I don't know?
>
> What I'm hoping to do is get an idea of which line of code allocates the most that isn't subsequently freed.

Curious, what are these programs?

You might hook into the GC and just write out stats; I believe there is a stats collector in there somewhere.
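For what it's worth, here's a minimal sketch of that idea, assuming core.memory.GC.stats() (druntime's built-in stats struct with used/free heap sizes); the function name, file path and interval are made up for illustration:

```d
// Minimal sketch: log GC heap stats from inside a long-running process.
// Assumes core.memory.GC.stats(); path and interval are arbitrary.
import core.memory : GC;
import core.thread : Thread;
import core.time : minutes;
import std.datetime.systime : Clock;
import std.stdio : File;

void startGcStatsLogger(string path = "/tmp/gc-stats.log")
{
    // Background daemon thread that appends one line of GC stats every few minutes.
    auto t = new Thread({
        while (true)
        {
            auto s = GC.stats();
            File(path, "a").writefln("%s used=%s free=%s",
                                     Clock.currTime(), s.usedSize, s.freeSize);
            Thread.sleep(5.minutes);
        }
    });
    t.isDaemon = true;
    t.start();
}
```

Watching usedSize over a few days would at least tell you whether the growth is in the GC heap at all, or somewhere outside it.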

I did this once by replacing `new` and monitoring/tallying the allocations myself. It didn't help with any GC issues, but it at least made sure all my own allocations were correct.

You could do something similar and just output the numbers to a text file (written every so often).

If the programs are not too large, you could use named allocations that could then be graphed individually (or use the file locations, __FILE__, etc.).

Search and replace all the `new`s and allocs with your custom ones, and override the GC's.
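Roughly something like this sketch of the replace-`new` part (trackedNew and dumpAllocStats are invented names, thread safety is skipped, and overriding the GC's own entry points is left out):

```d
import std.format : format;
import std.stdio : File;

__gshared size_t[string] bytesBySite;   // "file:line" -> total bytes requested

// Drop-in replacement for `new T(args)` on classes; the call site is captured
// via __FILE__/__LINE__ template defaults so each site can be graphed later.
T trackedNew(T, string file = __FILE__, size_t line = __LINE__, Args...)(Args args)
    if (is(T == class))
{
    auto key = format("%s:%s", file, line);
    // Only the per-instance size is counted; GC block overhead is ignored.
    if (auto p = key in bytesBySite)
        *p += __traits(classInstanceSize, T);
    else
        bytesBySite[key] = __traits(classInstanceSize, T);
    return new T(args);
}

// Call this every so often (e.g. from a timer) to write the table out.
void dumpAllocStats(string path = "/tmp/allocs.txt")
{
    auto f = File(path, "w");
    foreach (site, total; bytesBySite)
        f.writefln("%s\t%s bytes", site, total);
}

// Usage: `auto w = trackedNew!Widget(42);` instead of `auto w = new Widget(42);`
```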

Should give you a good idea of what's going on.
