On Sat, 05 Mar 2005 Hrvoje Niksic wrote:
> -D filters the URLs encountered with -r.  Specifying an input file is
> the same as specifying those URLs on the command-line.  If you need to
> exclude domains from the input file, I guess you can use something
> like `grep -v'.

Hi Hrvoje,

thanks - but grep is not a suitable option on its own. I'd first have to
combine it with perl/sed/awk, both to merge <a href tags that span more than
one line and to split lines that contain more than one href, just to make
sure that only the desired domains end up in the list.
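To illustrate the kind of pipeline that would be needed (a rough sketch only; the file name and domain are placeholders, not anything from my actual setup):

```shell
# Create a tiny sample "mailbox" that shows both problems: a tag
# split across two lines, and two hrefs on a single line.
cat > mbox.html <<'EOF'
Some text <a
href="http://example.com/a.html">one</a>
<a href="http://other.org/x">x</a> <a href="http://example.com/b.html">two</a>
EOF

# Rejoin lines so multi-line <a href ...> tags become one line, pull
# out every href= value (several per line are fine), then keep only
# URLs on the wanted domain.
tr '\n' ' ' < mbox.html \
  | grep -o 'href="[^"]*"' \
  | sed 's/^href="//; s/"$//' \
  | grep '://\([^/.]*\.\)*example\.com/' > urls.txt

cat urls.txt
# wget -i urls.txt would then fetch only the example.com URLs
```

That works, but it's exactly the sort of pre-filtering I'd hoped -D could do for me.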

It's a pity that the filter options work on -r only, but not on -i (which
could be treated as a special case of -r -l1).

Application example: I have a mailbox file that contains those URLs, and I'd
like to download everything from a certain site.

One workaround might be to convert this mailbox to a basic HTML file and
fetch it via HTTP in order to force the -r -H -D branch, instead of using
-D -F -i locally.


Is there an easier solution to tell wget that -i is 'within' the recursive
path?

Thanks,
Martin
