Not a lot of details here, sorry, but I'm just wondering if this sounds
familiar to anyone.

I've got a spider program that, when run on FreeBSD 2.2.8 with perl 5.6.1 and
LWP::UserAgent 1.80 (not sure which LWP bundle that is), uses a huge
amount of memory after only about 70 documents have been requested:

USER   PID  %CPU  %MEM    VSZ    RSS  TT  STAT  STARTED     TIME  COMMAND
root  7246  29.7  56.4  97844  35544  p1  D+    10:14AM  0:54.10  /usr/local/bin/perl -w ./spider.pl cb.pl

They're spidering a web server on the same machine the spider is running on
(i.e. spidering a local machine).

If I run the same spider from my Linux machine (spidering the same site), it
only uses about 10M after spidering 4000 documents.  There I'm running perl
5.6.0 and LWP bundle _90.

The spider uses:
use LWP::RobotUA;
use HTML::LinkExtor;

And spiders recursively.
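
For reference, here's a rough sketch of the kind of recursive loop the
spider runs (this is not the actual spider.pl/cb.pl; the start URL, agent
name, and contact address below are just placeholders):

use strict;
use warnings;
use LWP::RobotUA;
use HTML::LinkExtor;
use HTTP::Request;
use URI;

my $start = 'http://localhost/';   # placeholder start URL
my $host  = URI->new($start)->host;

my $ua = LWP::RobotUA->new('example-spider/0.1', 'webmaster@localhost');
$ua->delay(0);    # no politeness delay needed when spidering a local host

my %seen;
my @queue = ($start);

while (my $url = shift @queue) {
    next if $seen{$url}++;

    my $res = $ua->request(HTTP::Request->new(GET => $url));
    next unless $res->is_success && $res->content_type eq 'text/html';

    # Extract links and queue any that stay on the same host.
    my $extor = HTML::LinkExtor->new(sub {
        my ($tag, %attr) = @_;
        return unless $tag eq 'a' && $attr{href};
        my $abs = URI->new_abs($attr{href}, $res->base)->canonical;
        push @queue, "$abs" if $abs->can('host') && $abs->host eq $host;
    });
    $extor->parse($res->content);
}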

Could this be anything to do with FreeBSD?  Or maybe the older version of LWP?

Any pointers to tracking this down?  I'm helping someone entirely by email,
and I'm not clear how much perl debugging experience they have.



Bill Moseley
mailto:[EMAIL PROTECTED]
