Feeding wget a list of URLs on stdin (`wget -i-`) currently reads the entire input into memory before extracting a linked list of URLs from it. That is, stdin must reach EOF before any retrieval starts, and roughly twice the input's size is held in memory to store its URLs.

Among other things, this means wget can't deal with infinite URL lists:

$ yes http://example.com | wget -i-

I've patched wget to special-case reading from STDIN in retrieve_from_file.

Unfortunately my C's a bit rusty, and I wasn't sure how to create an appropriate set of tests.

See https://github.com/jnothman/wget (73e9da29)

Cheers, and thanks for a great util!

- Joel
