Jim has been helping me with this problem for some time now. We haven't been able
to find a solution, so I thought I should ask here for ideas.

I have a list of 10,028 or so URLs. The machine is a 500 MHz box with 512 MB of RAM
and 8 GB of swap on a very fast SCSI drive. The system sees the swap and puts it
to use. Memory does not seem to be the issue, at least not the amount of it.

The dig is set to go about 4 levels deep, uncompressed, and there is 100 GB of free
space for the dig, with more available if needed.
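
For reference, the relevant attributes in my htdig.conf look roughly like this
(attribute names as documented for 3.2; the path shown is a placeholder, not my
real one):

    max_hop_count:      4              # go about 4 levels deep from each start URL
    compression_level:  0              # leave the databases uncompressed
    database_dir:       /big/htdig/db  # placeholder; on the volume with the 100GB free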

When I run the dig, things go well for a few hours, then the memory slowly but
surely gets used up, all of it. At first we thought it was simply running out, but
then we saw that even with the giant swap it was still exhausting main memory and
would eventually die. It's never clear why it dies, and I'm nervous about
continuing to retry because it uses up the resources of the remote sites.
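
To get a better picture of the growth next time, I'm thinking of logging the
process's memory use once a minute while the dig runs; something like this rough
sketch (Linux-only, it just reads /proc, and it assumes the binary is invoked as
"htdig"):

    #!/usr/bin/env python
    # Rough sketch: log the memory use of the running htdig process once a
    # minute so the growth curve can be looked at after it dies.
    # Linux-only (reads /proc); assumes the binary is invoked as "htdig".

    import os
    import time

    def find_pid(name):
        # Look through /proc for a process whose argv[0] matches `name`.
        for entry in os.listdir("/proc"):
            if not entry.isdigit():
                continue
            try:
                with open("/proc/%s/cmdline" % entry, "rb") as f:
                    argv0 = f.read().split(b"\0")[0].decode(errors="replace")
            except IOError:
                continue
            if os.path.basename(argv0) == name:
                return entry
        return None

    def memory_fields(pid):
        # Pull VmSize (virtual) and VmRSS (resident) out of /proc/<pid>/status.
        fields = {}
        with open("/proc/%s/status" % pid) as f:
            for line in f:
                if line.startswith("VmSize") or line.startswith("VmRSS"):
                    key, value = line.split(":", 1)
                    fields[key] = value.strip()
        return fields

    if __name__ == "__main__":
        while True:
            pid = find_pid("htdig")
            if pid is None:
                print("htdig is not running")
                break
            print("%s  %s" % (time.strftime("%Y-%m-%d %H:%M:%S"), memory_fields(pid)))
            time.sleep(60)

If it really is growing without ever giving memory back, the log should show
VmSize climbing more or less monotonically until the process dies.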

Has anyone seen a case where the dig dies after a day or two, using up all of its
main memory and never releasing it? It seems to release a little, but then uses it
up again, more and more each time.

The version I'm using right now is 3.2.0b4.

Mike



