Hi folks,
I'm building a small system using Perl and PostgreSQL, but I have a problem 
when clients access tables with about 2000 rows.
I have a piece of code that lists all the data from one table (the one with 
2000 rows), and the Perl process becomes bloated (13 MB of RAM) and slow 
(it can take about 7 minutes) to show the results.

The program is not complicated; it's as simple as:

my $query = "SELECT * FROM table";
my $sth   = $dbh->prepare($query);
$sth->execute();
while (my @data = $sth->fetchrow_array) {
    print "<tr><td>$data[0]</td><td>$data[1]</td></tr>\n";
}
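
(A sketch of one thing I've been considering, untested: DBD::Pg buffers the 
entire result set in the driver when execute() returns, so for a large table 
a server-side cursor fetched in batches should keep the Perl process small. 
The cursor name "csr" and the batch size of 100 are arbitrary, and $dbh is 
assumed to be an open DBD::Pg handle.)

```perl
# Cursors must live inside a transaction in PostgreSQL.
$dbh->begin_work;
$dbh->do("DECLARE csr CURSOR FOR SELECT * FROM table");
while (1) {
    # Fetch the next batch of up to 100 rows from the cursor.
    my $rows = $dbh->selectall_arrayref("FETCH 100 FROM csr");
    last unless @$rows;    # no rows left: done
    print "<tr><td>$_->[0]</td><td>$_->[1]</td></tr>\n" for @$rows;
}
$dbh->do("CLOSE csr");
$dbh->commit;
```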


I'm running Red Hat 7.1 with a 2.4.2 SMP kernel on a dual Pentium III PC 
with 256 MB of RAM.
The clients run Windows, but I have monitored the processes on a terminal 
and saw them bloated, using quite a lot of CPU yet slow in delivering data 
to the clients.
The network is not saturated; I have transmitted a 3 MB file in less than a 
second.

Thanks in advance,
Carlos López Linares