Hi

When recursively downloading large files from a web directory (directory listing enabled) with
wget -r http://example.com/folder/
wget fails on Solaris with 512 MB of memory:
wget: realloc: Failed to allocate 1073741824 bytes; memory exhausted.
On Windows with 1 GB of memory, Cygwin's wget uses too much memory (more than 2 GB).

The files are about 1 GB each, and there are about 7-9 of them. Since
1073741824 bytes is exactly 1 GiB, roughly the size of one file, wget
appears to be trying to buffer an entire file in memory. The same
problem persists if other options are added, e.g.
wget -r -c -np -nd -nH http://example.com/folder/

If wget is run on the files individually (without -r), it works fine, e.g.
wget -O in http://example.com/folder/
wget -i in -F --base=http://example.com/folder/ http://example.com/folder/
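
For reference, the two-step workaround above can be wrapped in a small
shell script. This is a minimal sketch, assuming the listing page at
http://example.com/folder/ links to the files directly; the file name
listing.html is just an illustration, and the final URL argument from the
command above is dropped since the index is already saved in step 1:

#!/bin/sh
# Step 1: save the directory index page itself.
wget -O listing.html http://example.com/folder/
# Step 2: treat the saved index as HTML (-F), resolve its relative
# links against the folder (--base), and fetch every linked file.
wget -i listing.html -F --base=http://example.com/folder/

This avoids recursive mode entirely, so the downloaded files themselves
are never scanned for links.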
