Siegfried Heintze wrote:
I recommend Lincoln Stein's book "Network Programming with Perl".
Even if you are too cheap to buy his book, you can google for it and
download the source code for an example program that uses HTML::Parser to
extract and download all the gif files from a page. His example actually
parses the HTML and it sounds like you are not interested in that part.
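(For reference, that example boils down to something like the sketch below -- not the book's actual code, just the general idea, with the URL handling improvised.)

#!/usr/bin/perl
# Rough sketch: scan a page with HTML::Parser for <img> tags whose src
# ends in .gif, then fetch each one with LWP. Not Stein's program.
use strict;
use warnings;
use HTML::Parser;
use LWP::Simple qw(get getstore);
use URI;

my $base = shift or die "usage: $0 <url>\n";
my $html = get($base) or die "can't fetch $base\n";

my @gifs;
my $p = HTML::Parser->new(
    api_version => 3,
    start_h     => [ sub {
        my ($tag, $attr) = @_;
        push @gifs, $attr->{src}
            if $tag eq 'img' && $attr->{src} && $attr->{src} =~ /\.gif$/i;
    }, 'tagname, attr' ],
);
$p->parse($html);
$p->eof;

for my $src (@gifs) {
    my $abs = URI->new_abs($src, $base);   # resolve relative URLs
    (my $file = $abs->path) =~ s{.*/}{};   # save under the basename
    next unless $file;
    getstore("$abs", $file);
}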
I looked at WWW::Mechanize and was dismayed because it seemed extremely
specific. It only had a few functions and was not general purpose.
Siegfried
It depends on what you're looking to use it for -- if you need to manipulate a
foreign site via scripts and parse the return results, WWW::Mechanize is your
best bet.
example: http://www.webdragon.net/miscel/tinyurl.htm
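Here's the general shape of that kind of script (the URL and form/field names
below are made-up placeholders, not taken from the tinyurl example):

#!/usr/bin/perl
# Drive a remote site: fetch a page, fill in and submit a form, follow a
# link in the response, and print the result. Details are illustrative only.
use strict;
use warnings;
use WWW::Mechanize;

my $mech = WWW::Mechanize->new( autocheck => 1 );
$mech->get('http://example.com/login');

$mech->submit_form(
    form_number => 1,
    fields      => { user => 'me', pass => 'secret' },
);
$mech->follow_link( text_regex => qr/reports/i );

print $mech->content;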
It all comes down to how you use it and what you're trying to do -- try
grabbing a page with WWW::Mechanize and then dumping the object with
Data::Dumper, and you'll see a *wealth* of usable information.
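Something like this is all it takes (any URL will do):

#!/usr/bin/perl
# Grab a page and dump the whole object to see what Mechanize keeps around
# (links, forms, images, headers, history, and so on).
use strict;
use warnings;
use WWW::Mechanize;
use Data::Dumper;

my $mech = WWW::Mechanize->new;
$mech->get('http://www.example.com/');
print Dumper($mech);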
If you're just looking to suck down pages, though, LWP::Simple or even the
command-line wget would probably do the job.
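For that case a one-call fetch is plenty, e.g.:

#!/usr/bin/perl
# Fetch the raw page in one call -- roughly what
#   wget http://www.example.com/
# gives you from the shell.
use strict;
use warnings;
use LWP::Simple;

my $page = get('http://www.example.com/')
    or die "couldn't fetch the page\n";
print $page;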