Hi Neil - Thanks for the advice. First I tried changing my file extension from .plx to .pl, and now at least PerlIS doesn't hang, but I was still getting a CGI timeout error. So I took your advice and ran the linkchecker from the command line and generated HTML. No problems. Now I want to call that script via exec() or system() from my CGI script, but I get "remote procedure call failed" every time. I've never used exec or system from within a CGI script before. Is it allowed?
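For reference, the call I'm attempting looks roughly like this (the script path is just a placeholder, not my real layout):

```perl
use strict;
use warnings;

# Placeholder path -- substitute the real location of the checker script.
my $script = 'C:\\scripts\\linkchecker.pl';

# $^X is the perl interpreter currently running this script; calling
# system() with a LIST avoids going through the shell.
my $rc = system($^X, $script);

print "Content-type: text/plain\n\n";
if ($rc == -1) {
    print "could not launch checker: $!\n";          # the spawn itself failed
} else {
    printf "checker exited with status %d\n", $? >> 8;  # real exit code is in the high byte of $?
}
```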

   - Dorian

----- Original Message ----- From: "Neil Burnett" <[EMAIL PROTECTED]>
To: "'dorian'" <[EMAIL PROTECTED]>; <[EMAIL PROTECTED]>
Sent: Thursday, March 03, 2005 10:01 PM
Subject: RE: linkchecker problem



-----Original Message-----
From: [EMAIL PROTECTED] [mailto:perl-win32-
[EMAIL PROTECTED] On Behalf Of dorian
Sent: Friday, March 04, 2005 12:56 AM
To: [EMAIL PROTECTED]
Cc: dorian
Subject: linkchecker problem

Hi - I'm trying to write a simple link checker on Win2000. Simple as in
loop thru a list of URLs and check the response code. I tried to
implement this using LWP::Simple (as I have once before on UNIX) but it
crashes perl. When I say "crashes perl" I mean I get the dreaded "script
produced no output" error from all the CGI scripts on the server. I then
restart the WWW service under Windows services and everything is OK, until
I run my script. And I've tested this on two different servers BTW, both
Win2000. So I tried using HTTP::SimpleLinkChecker but I get the same
behavior, it crashes perl. It's intermittent; I can run the script once
or twice successfully, on a list of about 50 URLs, but it will "crash
perl" the next time I run the script. I've looked in the Perl error log
but I don't see errors. I've googled and searched the net to no avail.
Does anyone have a clue as to what could be causing this?


thanks in advance - Dorian Winterfeld
[EMAIL PROTECTED]

Probably. Bad links sometimes take a long time to return a response. This could be a timeout problem.


I would suggest not using the web server for checking the links, but splitting the job into 3 parts:

1 create a list of links to check in a database or file
2 run your checker from the command line, reading in the list and updating it with the response codes returned
3 create a web page to report the results if you wish
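Steps 1 and 2 can be sketched like this (file format, the 10-second timeout, and the sub names are my assumptions, not a finished tool):

```perl
use strict;
use warnings;

# Parse the link list: one URL per line; blank lines and #-comments skipped.
sub read_urls {
    my ($text) = @_;
    return grep { length($_) && !/^#/ }
           map  { s/^\s+|\s+$//g; $_ }
           split /\n/, $text;
}

# HEAD each URL and collect [response code, url] pairs (assumes LWP is installed).
sub check_urls {
    my (@urls) = @_;
    require LWP::UserAgent;
    my $ua = LWP::UserAgent->new(timeout => 10);   # don't hang on dead hosts
    return map { [ $ua->head($_)->code, $_ ] } @urls;
}

if (@ARGV) {                                       # e.g.  perl checker.pl links.txt
    open my $in, '<', $ARGV[0] or die "can't read $ARGV[0]: $!";
    local $/;                                      # slurp the whole file
    for my $r (check_urls(read_urls(<$in>))) {
        print join("\t", @$r), "\n";               # e.g.  404<TAB>http://...
    }
}
```

The tab-separated output is easy to load back in for step 3's report page.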


My link checker uses LWP::UserAgent and checks 4000+ links, around 200 of them bad. It runs as a Scheduled Task using a multi-threaded version of step 2 above and takes around 5 minutes.
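FWIW, the multi-threaded flavour of step 2 can be sketched with core threads + Thread::Queue (the worker count and timeout are arbitrary, and this isn't my actual script):

```perl
use strict;
use warnings;
use threads;
use Thread::Queue;    # end() needs a reasonably recent Thread::Queue (3.x)

my @urls  = @ARGV;                      # pass the URLs on the command line
my $queue = Thread::Queue->new(@urls);  # shared work queue
$queue->end;                            # dequeue returns undef once drained

my @workers = map {
    threads->create(sub {
        my $ua;                         # one UserAgent per worker, built lazily
        while (defined(my $url = $queue->dequeue)) {
            $ua ||= do { require LWP::UserAgent;
                         LWP::UserAgent->new(timeout => 10) };
            printf "%d\t%s\n", $ua->head($url)->code, $url;
        }
    });
} 1 .. 4;                               # 4 workers is an arbitrary choice

$_->join for @workers;
```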

Or buy a commercial linkchecker :-)


_______________________________________________
Perl-Win32-Web mailing list
Perl-Win32-Web@listserv.ActiveState.com
To unsubscribe: http://listserv.ActiveState.com/mailman/mysubs


