[EMAIL PROTECTED] wrote:

I ran wget (1.9.1) on Debian GNU/Linux to find out how many links my site had, and after "Queue count 66246, maxcount 66247" the wget process ran out of memory. Is there a way to keep the persistent state on disk instead of in memory, so that all the system memory and cache are not slowly consumed until the process halts? My site may have 1 to 2 million links.
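As far as I know, wget keeps its recursion queue entirely in memory, so there is no built-in option for this. For illustration only, here is a minimal sketch of the idea the question describes (a crawl queue persisted to disk, using SQLite from Python); the DiskQueue class, its methods, and the file name are hypothetical and not part of wget:

# Minimal sketch of a disk-backed crawl queue using SQLite.
# The table doubles as the "seen" set, so memory use stays flat
# no matter how many links (1-2 million here) get queued.
import sqlite3


class DiskQueue:
    def __init__(self, path="crawl-queue.db"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS urls ("
            "  url  TEXT PRIMARY KEY,"   # PRIMARY KEY gives dedup for free
            "  done INTEGER DEFAULT 0)"  # 0 = queued, 1 = visited
        )

    def push(self, url):
        # INSERT OR IGNORE silently skips URLs we have already seen.
        self.db.execute("INSERT OR IGNORE INTO urls (url) VALUES (?)", (url,))
        self.db.commit()

    def pop(self):
        # Fetch the oldest unvisited URL, or None when the queue is empty.
        row = self.db.execute(
            "SELECT url FROM urls WHERE done = 0 ORDER BY rowid LIMIT 1"
        ).fetchone()
        if row is None:
            return None
        self.db.execute("UPDATE urls SET done = 1 WHERE url = ?", row)
        self.db.commit()
        return row[0]


if __name__ == "__main__":
    q = DiskQueue()
    q.push("http://example.com/")
    q.push("http://example.com/")   # duplicate, ignored
    while (url := q.pop()) is not None:
        # a real crawler would fetch the page here and q.push() every link found
        print("would fetch:", url)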

hi oscar,

exactly how much memory does wget take? could you please check whether the most recent version of wget (1.10.2) gives you the same problem?

--
Aequam memento rebus in arduis servare mentem... (Remember to keep a calm mind in hard times...)

Mauro Tortonesi                          http://www.tortonesi.com

University of Ferrara - Dept. of Eng.    http://www.ing.unife.it
GNU Wget - HTTP/FTP file retrieval tool  http://www.gnu.org/software/wget
Deep Space 6 - IPv6 for Linux            http://www.deepspace6.net
Ferrara Linux User Group                 http://www.ferrara.linux.it
