I recommend Lincoln Stein's book "Network Programming with Perl". Even if you are too cheap to buy the book, you can Google for it and download the source code for an example program that uses HTML::Parser to extract and download all the GIF files from a page. His example actually parses the HTML, though it sounds like you are not interested in that part.
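Stein's example isn't reproduced here, but a minimal sketch of the same idea (not his code; the page URL is a placeholder, and it simply collects <img> src attributes with HTML::Parser and saves each image next to the script) might look like this:

#!/usr/bin/perl
use strict;
use warnings;

use LWP::Simple qw(get getstore);
use HTML::Parser;
use URI;

# Hypothetical page URL -- substitute the page you actually want to scan.
my $page_url = 'http://www.example.com/';

my $html = get($page_url)
    or die "Couldn't fetch $page_url\n";

my @images;

# Collect the src attribute of every <img> tag as the parser sees it.
my $parser = HTML::Parser->new(
    api_version => 3,
    start_h     => [
        sub {
            my ($tagname, $attr) = @_;
            push @images, $attr->{src}
                if $tagname eq 'img' && defined $attr->{src};
        },
        'tagname, attr',
    ],
);
$parser->parse($html);
$parser->eof;

# Resolve relative links against the page URL and save each image locally.
for my $src (@images) {
    my $abs = URI->new_abs($src, $page_url);
    (my $file = $abs->path) =~ s{.*/}{};    # basename of the URL path
    next unless length $file;
    print "Saving $abs as $file\n";
    getstore($abs, $file);
}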
I looked at WWW::Mechanize and was dismayed because it seemed extremely specific: it only had a few functions and was not general purpose.

Siegfried

-----Original Message-----
From: Scott R. Godin [mailto:[EMAIL PROTECTED]
Sent: Wednesday, September 14, 2005 9:33 PM
To: beginners@perl.org; [EMAIL PROTECTED]
Subject: Re: extract web pages from a web site

José Pedro Silva Pinto wrote:
> Hi there,
>
> I am writing a program in Perl to extract some web pages (and copy them to local files) from a given web address.
>
> Which Perl module can I use to help me with this task?

It depends on what you're looking to do... LWP::Simple to grab pages with; WWW::Mechanize and HTML::TokeParser or HTML::Parser to interact with a site and pick apart the results. If you simply want to download and store the web page, wouldn't you also want to store the attendant image/CSS/JavaScript/embedded files that it references externally?
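For the plain "fetch a page and copy it to a local file" case that José asks about, LWP::Simple alone is roughly enough. A minimal sketch (the URL and output filename are placeholders) could look like this:

#!/usr/bin/perl
use strict;
use warnings;

use LWP::Simple qw(getstore);
use HTTP::Status qw(is_success);

# Hypothetical URL and output file -- substitute your own.
my $url  = 'http://www.example.com/index.html';
my $file = 'index.html';

# getstore() writes the response body to $file and returns the HTTP status code.
my $status = getstore($url, $file);
die "Failed to fetch $url (HTTP status $status)\n"
    unless is_success($status);

print "Saved $url to $file\n";

This only saves the HTML itself; grabbing the referenced images, CSS, and JavaScript as well means parsing the page for those URLs, which is where HTML::Parser or WWW::Mechanize come in.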