> > I'm trying to debug this (opengl) application I'm writing.
> > Now something odd happens: when I run it in gdb, it occasionally
> > SIGSEGVs, which is not OK but expected; but when run under valgrind,
> > the whole system crashes.
> > I think it starts swapping like hell, but far more than the usual out
> > of memory situation because not even the mouse cursor moves.
> > So what I would like to know: is it possible to have valgrind limit
> > the amount of memory the 'guest application' uses? I could not find
> > this in the man page.
>
> Which Valgrind version are you using?
> 3.7.0 contains a fix related to memory usage (bug 250101).

I'm using 3.7.0 - the one currently in Debian testing.

> If you still have a problem, ulimit -d .... will limit the total memory
> used by Valgrind and the guest application.

I tried that, but it did not help.

I'm also not entirely sure it really is a leak in my program: I have a
routine in it that checks the program's memory usage (via /proc) 20
times per second and calls exit() when it reaches 500 MB (normally it
should not use more than 120 MB).
If it is something in Xorg instead, would valgrind "see" it? I think it
would (not sure, that's why I ask), since it would be the outcome of a
call made by my program (via SDL/GL).


Thanks.

--
www.vanheusden.com
bitcoin account: 14ExronPRN44urf4jqPMyoAN46T75MKGgP
msn address: [email protected]

_______________________________________________
Valgrind-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/valgrind-users