H> I suppose forking would not be too hard, but dealing with output from
H> forked processes might be tricky. Also, people would expect `-r' to
H> "parallelize" as well, which would be harder yet.
OK, maybe add a section to the manual showing that you have
considered parallelism.

Dan Jacobson <[EMAIL PROTECTED]> writes:
> Phil> How about
> Phil> $ wget URI1 & wget URI2
>
> Mmm, OK, but unwieldy if many. I guess I'm thinking about e.g.,
> $ wget --max-parallel-fetches=11 -i url-list
> (hmm, with default=1 meaning not parallel, but sequential.)
I suppose forking would not be too hard, but dealing with output from
forked processes might be tricky. Also, people would expect `-r' to
"parallelize" as well, which would be harder yet.

Dan Jacobson <[EMAIL PROTECTED]> writes:
> Maybe add an option so e.g.,
> $ wget --parallel URI1 URI2 ...
> would get them at the same time instead of in turn.
You can always invoke Wget in parallel by using something like `wget
URI1 & wget URI2 &'. How would a `--parallel' option be different
from that?

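Phil's `&' suggestion, quoted above, works but leaves the shell not knowing when the fetches have finished; adding `wait' fixes that. A minimal sketch (URI1 and URI2 stand in for real URLs):

```shell
# Run two fetches in the background, then block until both finish.
# -q silences wget so the two processes don't interleave progress lines.
wget -q "URI1" &
wget -q "URI2" &
wait
```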
Phil> How about
Phil> $ wget URI1 & wget URI2
Mmm, OK, but unwieldy if many. I guess I'm thinking about e.g.,
$ wget --max-parallel-fetches=11 -i url-list
(hmm, with default=1 meaning not parallel, but sequential.)
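For what it's worth, the proposed `--max-parallel-fetches' option does not exist in Wget; a rough approximation of the idea above, assuming GNU xargs with its `-P' flag, is:

```shell
# Approximates the proposed `wget --max-parallel-fetches=11 -i url-list':
# -n 1  hands one URL to each wget invocation,
# -P 11 keeps at most 11 wget processes running at once.
xargs -n 1 -P 11 wget -q < url-list
```

This runs into exactly the output problem mentioned earlier in the thread: without -q, progress reports from the parallel processes interleave.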
Maybe add an option so e.g.,
$ wget --parallel URI1 URI2 ...
would get them at the same time instead of in turn.