> Slightly off-topic:
> Would it rather be perl committing all that memory or httpd?

add some instrumentation and you'll find out. symon can be good
for this sort of thing (you can have it monitor memory/cpu use of
specific processes at a frequent interval and graph them).
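For instance, a symon.conf rule along these lines would track memory/CPU of both candidate processes plus overall memory (process names and the stream destination are placeholders; check symon.conf(5) on your version for the exact grammar):

```
# /etc/symon.conf -- sketch, not verified against any particular symon release
monitor { cpu(0), mem, proc(httpd), proc(perl) } stream to 127.0.0.1 2100
```

symux on the receiving end can then write the data to rrd files for graphing, which makes it easy to see which process balloons before the machine starts to swap.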

> Can this be prevented one way or another?

login.conf

So I thought. I have read the man page a few times, but I still lack an understanding of where best to put the limits. Currently, the user sits in the default class, with:

	:datasize-max=512M:\
	:datasize-cur=512M:\
	:maxproc-max=128:\
	:maxproc-cur=64:\
	:openfiles-cur=128:\
	:stacksize-cur=4M:\
The datasize limit doesn't seem to cut it; free memory plus swap is usually around 3.5G. Now I wonder which of the following is best employed to keep system usage below the die-off point:
cputime, filesize, memoryuse, vmemoryuse?
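If vmemoryuse turns out to be the right knob (it caps a process's total virtual memory, whereas datasize only caps the data segment), a dedicated class might look like the sketch below. The class name and all values are illustrative guesses, not recommendations:

```
# /etc/login.conf -- hypothetical restrictive class
limited:\
	:vmemoryuse-cur=256M:\
	:maxproc-cur=32:\
	:openfiles-cur=128:\
	:tc=default:
```

The user would then be moved into it with something like `usermod -L limited theuser` (see usermod(8)), and the limits apply at her next login.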

Is there any further description, a link, or a document available with a formula for 'how to prevent one's system from running out of resources at all costs'? That would be the greatest and best; then I could put that user into such a class, and she could never bring down the system, right?
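As a rough sanity check, the limits above can simply be multiplied out (numbers taken from this thread; the 3.5G figure is my free mem+swap observation, not a measured ceiling):

```python
# Back-of-the-envelope check of the current login.conf limits.
ram_plus_swap_mb = 3.5 * 1024   # ~3.5G free memory + swap
maxproc_cur = 64                # maxproc-cur from the default class
datasize_cur_mb = 512           # datasize-cur from the default class

# Worst case: every allowed process grows to the full datasize limit.
worst_case_mb = maxproc_cur * datasize_cur_mb
print(worst_case_mb)            # 32768 MB -- far beyond RAM + swap

# For the per-user limits to actually bound the machine, roughly:
# datasize <= (RAM + swap) / maxproc
safe_datasize_mb = ram_plus_swap_mb / maxproc_cur
print(round(safe_datasize_mb))  # 56 MB per process
```

Which suggests the current datasize-cur cannot prevent exhaustion on its own: either maxproc or the per-process memory cap has to come way down.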

Thanks,

Uwe
