Dan Anderson wrote:


Very true. But you also need to look at what you're doing. A spider that indexes or collates pages across several sites might need to slurp up a large number of pages -- which, even at a few kilobytes apiece, would be costly in system resources.


Ironically, this is the one time I could see slurping not working either. Has anyone hit the limit on open file descriptors? I know it is OS dependent and pretty damn high, but on a large enough system with a busy enough app I suspect you could hit it. At that point you would have to slurp or close a filehandle...
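
For what it's worth, here is a quick sketch (assuming a POSIX-ish system and Perl's core POSIX module) of how you could check that limit and then watch open() start failing once you run into it -- the '/etc/hostname' path is just a stand-in, use any readable file:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use POSIX qw(sysconf _SC_OPEN_MAX);

    # Ask the OS for the per-process limit on open file descriptors.
    my $limit = sysconf(_SC_OPEN_MAX);
    print "Per-process limit on open file descriptors: $limit\n";

    # Keep opening handles on the same file until open() fails, which
    # shows the practical ceiling (a few fds are already taken by
    # stdin/stdout/stderr).
    my @handles;
    while (open my $fh, '<', '/etc/hostname') {
        push @handles, $fh;
    }
    print "open() gave up after ", scalar @handles, " handles: $!\n";
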


Dan always comes up with good discussion topics ;-)...

http://danconia.org




