Hello everyone,

I'm trying to cut down on unnecessary downloading in some scripts I have, and in
most cases it would actually suffice to download just a chunk of each file, for
example the first 5MB. I've looked through all the wget options, and the only
remotely useful one I found was -Q quota (--quota=quota):

Specify download quota for automatic retrievals. The value can be specified in
bytes (default), kilobytes (with k suffix), or megabytes (with m suffix). Note
that quota will never affect downloading a single file. So if you specify wget
-Q10k ftp://wuarchive.wustl.edu/ls-lR.gz, all of the ls-lR.gz will be
downloaded. The same goes even when several URLs are specified on the
command-line. However, quota is respected when retrieving either recursively, or
from an input file. Thus you may safely type wget -Q2m -i sites---download will
be aborted when the quota is exceeded.

Unfortunately, that option doesn't do what I hoped it would: it has no effect
on single-file downloads.
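The closest workaround I've come up with so far is to truncate the stream
myself rather than have wget stop early. A rough sketch (the URL is just a
placeholder; the first approach assumes GNU head, the second assumes the
server honors HTTP Range requests):

```shell
#!/bin/sh
# Placeholder URL -- substitute the real file.
URL="http://example.com/big.iso"

# Approach 1: stream to stdout and cut the pipe after 5 MB.
# wget may report a broken-pipe error when head closes its end early;
# that is expected and harmless here.
wget -q -O - "$URL" | head -c 5242880 > first5mb.bin

# Approach 2: ask for only bytes 0..5242879 via an HTTP range request
# (curl's -r/--range option). Only works if the server supports ranges.
curl -s -r 0-5242879 -o first5mb.bin "$URL"
```

The pipe approach still makes the server start sending the whole file, but
the connection is torn down once head has its 5 MB, so only a little extra
data crosses the wire; the range approach avoids even that when the server
cooperates.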

Googling the problem didn't turn up anything helpful. Can anyone help?

I've also posted this question at
http://www.linuxquestions.org/questions/showthread.php?p=2601450 without much
luck.

Thank you in advance.

Sincerely,
Archon810.
