Something like this?
use strict;
use warnings;
use WWW::Mechanize;

my $mech = WWW::Mechanize->new( autocheck => 0 );  # don't die on HTTP errors

open my $fh, '<', '/Program Files/OptiPerl/url.alls'
        or die "Couldn't open URL file: $!\n";
while (<$fh>) {
        chomp;
        my $res = $mech->get($_);
        if ($res->code > 399) {
                print "$_ is bad\n";
        }
}
close $fh;
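Since the stated goal is to end up with good.urls and bad.urls rather than
just printing, here is one untested sketch of that (the filenames come from
Brian's message; the helper names is_bad and check_urls are mine, and the
get/code calls are WWW::Mechanize's documented API):

```perl
use strict;
use warnings;
use WWW::Mechanize;

# Treat any 4xx/5xx response code as a broken link.
sub is_bad { my ($code) = @_; return $code >= 400 }

# Read one URL per line from $list_path and append each URL
# to good.urls or bad.urls depending on the response code.
sub check_urls {
    my ($list_path) = @_;
    my $mech = WWW::Mechanize->new( autocheck => 0 );  # don't die on HTTP errors

    open my $in,   '<', $list_path  or die "Can't read $list_path: $!";
    open my $good, '>', 'good.urls' or die "Can't write good.urls: $!";
    open my $bad,  '>', 'bad.urls'  or die "Can't write bad.urls: $!";

    while ( my $url = <$in> ) {
        chomp $url;
        next unless $url =~ /\S/;    # skip blank lines
        my $res = $mech->get($url);
        print { is_bad( $res->code ) ? $bad : $good } "$url\n";
    }
}

# check_urls('/Program Files/OptiPerl/url.alls');
```

With 4K links you may also want a timeout, e.g.
WWW::Mechanize->new( timeout => 10 ), so one dead server doesn't stall the run.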

-----Original Message-----
From: Brian Volk [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, July 13, 2004 11:51 AM
To: '[EMAIL PROTECTED]'
Subject: RE: LWP::Simple question 


I forgot to include [EMAIL PROTECTED] :-)

Hi Andy, 

I've been reading up on WWW::Mechanize for a while now and I still can't
figure out how to load the file that contains all my URLs...  can you
please let me know if I'm close?

my $urls = "/Program Files/OptiPerl/url.alls";   <--- this is a local
file that looks like this:

http://www.spartanchemical.com/sfa/MSDSRep.nsf/docid/68674917D62B62C3852567D8004E39BE?OpenDocument
http://www.spartanchemical.com/sfa/MSDSRep.nsf/docid/0BA3D2C7AF46CE8D852567D3004E6108?OpenDocument

Can I load this local text file (url.alls) so WWW::Mechanize will check
each link one after the other?  This local file consists of 4K links...
My intention then is to split the output into two files: good.urls
and.... of course bad.urls.

Thanks for your help!

B 




-----Original Message-----
From: Andy Lester [mailto:[EMAIL PROTECTED]]
Sent: Friday, July 09, 2004 8:52 PM
To: Brian Volk
Cc: '[EMAIL PROTECTED]'
Subject: Re: LWP::Simple question 


>
> These are links to MSDS (material safety data sheets).  I need to
> check and see which links are broken.  Is this a job for LWP::Simple?
> If so can someone please explain how I am supposed to load the local
> file?

It's a job for WWW::Mechanize.

See http://search.cpan.org/dist/WWW-Mechanize/

Check the FAQ and the Cookbook.

xoxo,
Andy


--
Andy Lester => [EMAIL PROTECTED] => www.petdance.com => AIM:petdance
