On Nov 3, 2003, at 11:16 PM, Nick Arnett wrote:
[EMAIL PROTECTED] wrote:
I've created a robot, www.dead-links.com, and I wonder if this list is
alive.
It is alive, but very, very quiet.
Yeah, this robots thing is just a fad, it'll never catch on. -Tim
Sean M. Burke wrote:
In short, if people want to see improvements to LWP, email me and say what
you want done.
For robots, you need a call that says: fetch this URL, but get a maximum
of XX bytes and spend a maximum of YY seconds doing it. The return status
should tell you whether it finished.
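For reference, LWP::UserAgent already exposes part of this through its max_size and timeout settings; here is a minimal sketch (the URL and limits are placeholders). One caveat: LWP's timeout applies per socket read rather than to the whole request, so it is not quite the total-time cap being asked for here.

use strict;
use warnings;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new;
$ua->max_size(100_000);   # stop reading the body after ~100 KB
$ua->timeout(30);         # give up if the server stalls between reads

my $res = $ua->get('http://www.example.com/');

# When max_size truncates the body, LWP records that fact in a
# response header -- the closest thing to a "did it finish?" status.
my $aborted = $res->header('Client-Aborted') || '';
if ($aborted eq 'max_size') {
    print "fetch truncated at the byte limit\n";
}
elsif ($res->is_success) {
    print "fetched ", length($res->content), " bytes completely\n";
}
else {
    print "request failed: ", $res->status_line, "\n";
}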
Sean M. Burke wrote:
I'm a bit perplexed over whether the current Perl library WWW::RobotRules
implements a certain part of the Robots Exclusion Standard correctly. So
forgive me if this seems a simple question, but my reading of the Robots
Exclusion Standard hasn't really cleared it up.
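For context, this is how the module is usually driven; a small sketch with illustrative URLs and agent name (what allowed() returns depends on whichever User-agent/Disallow records the parsed robots.txt contains):

use strict;
use warnings;
use WWW::RobotRules;
use LWP::Simple qw(get);

my $rules = WWW::RobotRules->new('MyRobot/1.0');

# Fetch and parse robots.txt for a host before asking about its URLs.
my $robots_url = 'http://www.example.com/robots.txt';
my $robots_txt = get($robots_url);
$rules->parse($robots_url, $robots_txt) if defined $robots_txt;

for my $url ('http://www.example.com/', 'http://www.example.com/private/x.html') {
    print $url, ($rules->allowed($url) ? " allowed" : " disallowed"), "\n";
}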
At 09:47 AM 14/03/02 -0800, srinivas mohan wrote:
Now, as the performance is low, we wanted to redevelop
our spider in a language like C or Perl and use
it with our existing product.
I will be thankful if anyone can help me choose
the better language, where I can get better performance.
At 10:36 AM 14/03/02 -0800, Nick Arnett wrote:
I wish
I could be more specific, but I never did figure out what was really going
on. Following an LWP request through the debugger is a long and convoluted
journey...
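One era-appropriate way to get that visibility without stepping through the debugger is LWP's own tracing module; a sketch (LWP::Debug has since been deprecated, and the URL is a placeholder):

use strict;
use warnings;
use LWP::Debug qw(+);    # '+' enables trace, debug, and conns output on STDERR
use LWP::UserAgent;

my $ua  = LWP::UserAgent->new;
my $res = $ua->get('http://www.example.com/');  # each internal step is logged
print $res->status_line, "\n";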
I totally agree with Nick that when LWP works, it's OK, but when
it doesn't...