I did something similar...
What I would do is wget the webpage, go through the page with a regex
like <a [^>]*href="[^"]*\.iso", and stick the URLs into a text
file. Then feed the text file to wget (-i switch) and you're done ;)
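In shell form, those steps look roughly like this (untested sketch; the URL is a placeholder, and it assumes lowercase, double-quoted href attributes and GNU grep's -o):

```shell
# Placeholder page URL -- substitute the real index page.
PAGE_URL="http://example.com/isos/"

# 1. fetch the page source, 2. pull out each href="...iso" attribute,
# 3. strip the attribute syntax, leaving one bare URL per line.
wget -qO- "$PAGE_URL" \
  | grep -oE 'href="[^"]*\.iso"' \
  | cut -d'"' -f2 \
  > iso-urls.txt

# 4. hand the URL list back to wget to download everything.
wget -i iso-urls.txt
```

A quick-and-dirty regex, not a real HTML parser, but for a simple directory-listing page it's usually good enough.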

On 3/14/07, siegfried <[EMAIL PROTECTED]> wrote:
I'm downloading a lot of debian images using wget these days. Can anyone
suggest where I might look to find a little perl script that will download a
web page, look for all the links containing ".iso" and then reinvent wget to
download them all?

Assuming such a script does not exist, can someone recommend a good set of
packages and functions I would use to reinvent wget?

Thanks,

Siegfried

--
- Yitzchok Good

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
http://learn.perl.org/

