I'm sure this comes up frequently, so I apologize...

I am designing a system to process almost 4000 remote sites in a nightly
sweep.  The process is controlled from a database that maintains site
status in real time (or at least that's the goal).  I am attempting to fork
off around 100 "drones" so that 100 sites are processed concurrently, and
each drone will need its own connection to the database.  In some impromptu
testing I've seen the following results...
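For reference, the queen I have in mind looks roughly like this (the DSN,
table and column names below are just placeholders, not my real schema):

    #!/usr/bin/perl -w
    use strict;
    use DBI;

    my $MAX_DRONES = 100;

    # Queen: pull the night's site list, then keep up to 100 drones
    # running, one drone per site.  Each drone opens its own database
    # connection after the fork.
    my $dbh   = DBI->connect('dbi:mysql:database=sweep;host=dbhost',
                             'user', 'password', { RaiseError => 1 });
    my $sites = $dbh->selectcol_arrayref('SELECT site_id FROM sites');
    $dbh->disconnect;

    my $running = 0;
    for my $site (@$sites) {
        # throttle: wait for a drone to finish before starting another
        while ($running >= $MAX_DRONES) {
            wait;
            $running--;
        }
        my $pid = fork;
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {            # child (drone)
            process_site($site);
            exit 0;
        }
        $running++;                 # parent (queen)
    }
    1 while wait != -1;             # reap the last batch

    sub process_site {
        my ($site_id) = @_;
        # connect to the database, poll the remote site, write its
        # status back -- the real work, omitted here
    }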

I wrote a queen that does nothing except fork off drones (it sets no
variables and opens no filehandles), and a drone that does nothing except
connect to the database.  On this machine config:

RedHat Linux 6.2, PIII 600, 256MB RAM, 256MB swap - anything more than 55
drones and the system runs entirely out of memory.
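Here's roughly what that stripped-down test looks like (again, the DSN and
credentials are placeholders, not what I actually use):

    #!/usr/bin/perl -w
    use strict;
    use DBI;

    my $DRONES = 100;       # how many children to fork

    # Queen: fork drones and wait for them -- nothing else.
    for my $n (1 .. $DRONES) {
        my $pid = fork;
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {
            drone($n);
            exit 0;
        }
    }
    1 while wait != -1;     # reap the children

    # Drone: connect to the database and nothing else.
    sub drone {
        my ($n) = @_;
        my $dbh = DBI->connect('dbi:mysql:database=sweep;host=dbhost',
                               'user', 'password', { RaiseError => 1 });
        sleep 60;           # hold the connection so memory use can be watched
        $dbh->disconnect;
    }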

Is Perl really using that much memory?  There is approx. 190MB of RAM free,
and nearly ALL of the swap free, when I kick off the queen process.

Do these results seem typical?  Is there any way to optimize this?

Thanks

Chuck


