Re: parallel fetching

2004-07-22 Thread Dan Jacobson
H> I suppose forking would not be too hard, but dealing with output
H> from forked processes might be tricky.  Also, people would expect
H> `-r' to "parallelize" as well, which would be harder yet.

OK, maybe add a section to the manual, showing that you have considered pa
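
One rough sketch of how the output problem could be sidestepped at the
shell level, using Wget's existing `-o' log-file option rather than any
new feature (the URIs and log names here are placeholders): each
instance writes its messages to its own log, so the parallel runs do
not interleave on the terminal.

$ wget -o fetch1.log URI1 &
$ wget -o fetch2.log URI2 &
$ wait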

Re: parallel fetching

2004-07-21 Thread Hrvoje Niksic
Dan Jacobson <[EMAIL PROTECTED]> writes:

> Phil> How about
> Phil> $ wget URI1 & wget URI2
>
> Mmm, OK, but unwieldy if many.  I guess I'm thinking about e.g.,
> $ wget --max-parallel-fetches=11 -i url-list
> (hmm, with default=1 meaning not parallel, but sequential.)

I suppose forking would not be too hard, but dealing with output from
forked processes might be tricky.  Also, people would expect `-r' to
"parallelize" as well, which would be harder yet.
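
Something close to the proposed behaviour can already be approximated
with GNU xargs, which can cap the number of simultaneous processes.
This is only a sketch of a shell-level workaround, not a Wget feature;
`url-list' is assumed to hold one URL per line:

$ xargs -n 1 -P 11 wget < url-list

Here `-P 11' limits the run to 11 concurrent wget processes and `-n 1'
hands each process a single URL; `-P 1' gives the sequential behaviour
that the proposed default of 1 would correspond to.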

Re: parallel fetching

2004-07-21 Thread Hrvoje Niksic
Dan Jacobson <[EMAIL PROTECTED]> writes:

> Maybe add an option so e.g.,
> $ wget --parallel URI1 URI2 ...
> would get them at the same time instead of in turn.

You can always invoke Wget in parallel by using something like
`wget URI1 & wget URI2 &'.  How would a `--parallel' option be different
from that?
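
As a minimal illustration of that shell-level approach (URI1 and URI2
are placeholders), the two fetches start concurrently and `wait' blocks
until both have finished:

$ wget URI1 & wget URI2 &
$ wait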

Re: parallel fetching

2004-07-18 Thread Dan Jacobson
Phil> How about
Phil> $ wget URI1 & wget URI2

Mmm, OK, but unwieldy if many.  I guess I'm thinking about e.g.,
$ wget --max-parallel-fetches=11 -i url-list
(hmm, with default=1 meaning not parallel, but sequential.)

parallel fetching

2004-07-13 Thread Dan Jacobson
Maybe add an option so e.g.,
$ wget --parallel URI1 URI2 ...
would get them at the same time instead of in turn.