[EMAIL PROTECTED] writes:
> Will wget build me such a copy of the entire site? Full interlinked
> and spiderable?
The command to make the copy would be something like
`wget --mirror --convert-links --html-extension URL'.

I started wget with

    wget --mirror --convert-links --html-extension http://mydomain.com/ /home/www/web10/9

It has been running for several hours now; "top" shows it using 65% of memory, about 300 MByte. How can I make wget copy the site file by file? And how can I stop it before it runs out of memory?

Thanks, Maggi
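One likely cause of trouble: in the command above, the trailing path `/home/www/web10/9` is passed as a second URL to retrieve, not as an output directory; wget's destination is set with `-P`/`--directory-prefix`. A sketch of a corrected invocation, reusing the same (placeholder) URL and path from the message:

```shell
# Mirror a site into a specific directory (URL and path are examples).
# wget writes under the current working directory by default;
# -P/--directory-prefix sets the destination explicitly -- a bare
# trailing path argument would be treated as another URL.
wget --mirror --convert-links --html-extension \
     --wait=1 \
     -P /home/www/web10/9 \
     http://mydomain.com/

# To stop a running mirror without losing what is already saved:
# press Ctrl-C in its terminal, or from another shell:
#   kill "$(pidof wget)"
# Rerunning the same --mirror command later resumes the job,
# re-fetching only files newer on the server (timestamping via -N,
# which --mirror implies).
```

`--wait=1` just slows the crawl to be polite to the server; it is optional. Whether this also cures the memory growth depends on the site (e.g. a CGI that generates endless links can make `--mirror` recurse forever; `--level=N` caps the recursion depth in that case).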
