On Tue, 27 Nov 2001, Bob Okumura wrote:

> coredumpsize    unlimited
> cputime         unlimited
> datasize        16384 kbytes
> filesize        unlimited
> maxproc         96
> memorylocked    85425 kbytes
> memoryuse       256272 kbytes
> openfiles       128
> stacksize       2048 kbytes

I can't say which of these is the culprit, though the stack size looks a
bit small. You'd have to use ulimit to restrict particular resources one
at a time to work out the exact problem. I'd look at stacksize,
memoryuse, memorylocked, and datasize; there's obviously only one htsearch
process, and it doesn't open more than a few files.
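To make that concrete, here's a rough sketch of the one-limit-at-a-time
approach (the htsearch path and query string are assumptions; adjust for
your install). The csh limit names above map to sh/bash ulimit flags
roughly as stacksize -> -s, datasize -> -d, memoryuse -> -m,
memorylocked -> -l, all in kbytes. Each run happens in a subshell so the
tightened limit doesn't stick to your login shell:

```shell
#!/bin/sh
# Tighten one resource at a time and rerun htsearch to see which limit
# reproduces the failure.  1024k is an arbitrary low test value.
for flag in s d m l; do
    ( ulimit -$flag 1024 2>/dev/null       # restrict just this resource
      ./htsearch 'words=test' >/dev/null 2>&1 ) \
        || echo "htsearch fails when ulimit -$flag is 1024 kbytes"
done
```

Whichever flag first makes htsearch fall over at a value near your
server's limits is the resource to ask your ISP about.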

> Any suggestions on what I might try to find a workaround?  Being on a
> virtual server, the server resources are not within my control.

No, though I'd expect the memory usage of 3.1.x to be lower.

If you're having this much trouble with a tight memory configuration of
the virtual server, I'd suggest two courses of action:
1) Dump your ISP. See <http://www.htdig.org/isp.html> for a list of ISPs
who claim to offer ht://Dig hosting.
2) Think about other solutions. If your ISP offers mod_perl, then a Perl
search solution might be useful, though obviously not the same as
ht://Dig.

--
-Geoff Hutchison
Williams Students Online
http://wso.williams.edu/



_______________________________________________
htdig-general mailing list <[EMAIL PROTECTED]>
To unsubscribe, send a message to <[EMAIL PROTECTED]> with a 
subject of unsubscribe
FAQ: http://htdig.sourceforge.net/FAQ.html