Hey

I'm running on a virtual server, so I have limited memory available. This shouldn't be
a problem, but I'm getting the nasty message "Out Of Memory!". I've had this problem
before, so I know I can't use up too much memory. That's why I'm very careful about
this and try to write my programs as cleanly as possible, but it seems like I'm
missing something here...

What I am trying to do is fetch data from one table, build a summary of that data
and write it to another table in the same database. The problem is that the data I'm
fetching is about 200,000 rows of 10 columns or so (not much data per row: some
numbers, 2x char(4) and 3x char(255)).
  my $query_templog = $DB->prepare(
      "SELECT * FROM stat_templog WHERE rec_date < CURRENT_DATE()"
  );
  $query_templog->execute();

  while ( my $hash_templog = $query_templog->fetchrow_hashref('NAME_lc') ) {
      ### doing nothing
  }
In the example above it SHOULD fetch one row at a time, create a hash for it and
return the reference. I thought this would be the most memory-efficient way of
fetching data, but when running this script (without doing anything with the fetched
data) it consumes more than 30 MB of RAM! I've read the Programming the Perl DBI book
from O'Reilly, and if I understood it correctly it shouldn't do this: each fetch
should simply replace the previously fetched row. But it's clearly NOT doing that.
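For completeness, here is a stripped-down version of the whole fetch. I'm assuming a MySQL backend here; the mysql_use_result attribute is something I found mentioned as a way to make DBD::mysql stream rows from the server instead of buffering the entire result set client-side after execute() — I haven't verified that it fixes my case, and the DSN and credentials are made up:

```perl
use strict;
use warnings;
use DBI;

# Hypothetical connection details, just for the example.
my $DB = DBI->connect(
    "dbi:mysql:database=stats;host=localhost",
    "user", "password",
    { RaiseError => 1 }
);

my $query_templog = $DB->prepare(
    "SELECT * FROM stat_templog WHERE rec_date < CURRENT_DATE()",
    { mysql_use_result => 1 }    # stream rows instead of buffering them all
);
$query_templog->execute();

while ( my $hash_templog = $query_templog->fetchrow_hashref('NAME_lc') ) {
    # process one row at a time
}
$query_templog->finish();
$DB->disconnect();
```

If the driver really does buffer the whole result set by default, that alone could explain the 30 MB, regardless of how the loop itself is written.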

Inside this loop I have to do a lot of calculations and write the results into 8
different tables. So if I'm using up all of the memory just by fetching this data, I
can forget about inserting it into the other tables!

When inserting this data into the other tables I always have to check whether the row
already exists; if it does, I have to update the value instead. After every one of
these checks I'm releasing the handle with finish(), but it doesn't seem to make any
difference :o(
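This is roughly the exists-check pattern I'm using, with the statement handles prepared once outside the loop rather than re-prepared per row (the table and column names below are placeholders, not my real schema):

```perl
use strict;
use warnings;
# Assumes $DB is an already-connected DBI handle.
my $check  = $DB->prepare("SELECT total FROM stat_summary WHERE key_col = ?");
my $update = $DB->prepare("UPDATE stat_summary SET total = total + ? WHERE key_col = ?");
my $insert = $DB->prepare("INSERT INTO stat_summary (key_col, total) VALUES (?, ?)");

sub upsert {
    my ( $key, $amount ) = @_;
    $check->execute($key);
    my ($existing) = $check->fetchrow_array();
    $check->finish();    # release this result set before the next query
    if ( defined $existing ) {
        $update->execute( $amount, $key );
    }
    else {
        $insert->execute( $key, $amount );
    }
}
```

Even with finish() after every check, memory keeps climbing, which is why I suspect the fetch loop rather than these inserts.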

Please help me out,... it's urgent :'-(

Greetings,
Tiele Declercq [ [EMAIL PROTECTED] ]

---

Project leader, Start.be
http://start.be
