It looks to me like you simply need to do your first fetch, get all the 
results for it, and store them in a persistent store somewhere, using 
something like Cache::Cache or MLDBM::Sync.

Then you can fetch as much of the remaining data as possible before the 
DBA shutdown, and on the next run simply resume where you left off.

The key is using some sort of persistence mechanism as discussed above.
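For illustration, here is a minimal sketch of the checkpoint-and-resume 
idea. It uses core Storable rather than Cache::Cache or MLDBM::Sync, but 
the shape is the same with any persistence mechanism; the file name, the 
id-based loop, and the checkpoint interval are all invented for the 
example:

```perl
use strict;
use warnings;
use Storable qw(store retrieve);

# Hypothetical checkpoint file -- the name is made up for this sketch.
my $ckpt = 'fetch_state.ckpt';

# Resume from the last checkpoint if one exists, else start fresh.
my $state = -e $ckpt
    ? retrieve($ckpt)
    : { last_id => 0, rows_done => 0 };

# Stand-in for the real fetch loop; here we pretend there are 10 rows.
for my $id ( $state->{last_id} + 1 .. 10 ) {
    # ... process row $id against tables 2 through 5 here ...
    $state->{last_id} = $id;
    $state->{rows_done}++;

    # Checkpoint every few rows so a disconnect loses little work.
    store( $state, $ckpt ) if $id % 5 == 0;
}

store( $state, $ckpt );    # final checkpoint
print "done through id $state->{last_id}\n";
```

With MLDBM::Sync you would tie a hash to a disk file instead of calling 
store/retrieve by hand, but the restart logic is identical: read the 
saved high-water mark, then continue the query from there.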


On Monday, October 29, 2001, at 12:13 PM, Shaozab, Sumera wrote:

> Thank you Terrence for pointing this out to me!
>
> I guess I am just frustrated trying to make my program more efficient.
> The DBA turns off all connections every night for backups, so I have to
> find a way to finish my program before the nightly backup. So far my
> program has been running for more than 12 hours before being
> disconnected, and I am only retrieving about 3 million records! Any
> suggestions on how I can make my program more efficient?
>
> This is the way I am doing it, using fetchrow_arrayref:
>
> while (fetch) {
>       execute
>       fetch   # on table 2
>               execute
>               fetch   # on table 3
>                       execute
>                       fetch   # on table 4
>                               execute
>                               fetch   # on table 5
> }
>
> Each query depends on the value from the first fetch.
>
> Sumera
>> ----------
>> From:        Terrence Brannon[SMTP:[EMAIL PROTECTED]]
>> Sent:        Monday, October 29, 2001 2:09 PM
>> To:  [EMAIL PROTECTED]
>> Subject:     Re: Batch Fetching -> Out of memory Error
>>
>>
>> On Monday, October 29, 2001, at 10:08 AM, Wilson, Doug wrote:
>>
>>>
>>>> From: Shaozab, Sumera [mailto:[EMAIL PROTECTED]]
>>>
>>>> tried, I got "Out of memory!" error.  The number of records I
>>>> am retrieving
>>>> is about 3 million records.  Is this too much for 
>>>> fetchall_arrayref()?
>>>> Should I upgrade DBI and DBD::Oracle? Any recommendation in
>>>> speeding up my
>>>> program?
>>>
>>> speed up? Why are you worried about speeding up your program; you
>>> should first worry about making it work. Upgrading a module isn't
>>> going to give you more memory. Only getting more memory is going
>>> to get you more memory. Or using less memory would be even
>>> better. DON'T fetch everything into memory. Use one of the
>>> 'fetchrow_*' methods instead of a 'fetchall_*' method and process
>>> one row at a time.
>>>
>>
>> Sumera, you seem to think that running fetchall_arrayref will make your
>> program run faster than fetchrow_arrayref... but neither is going to
>> make the retrieval of data from the database any faster. In fact, if 
>> you
>> take a look at the DBI source code, you will see that fetchall_arrayref
>> is implemented with fetchrow_arrayref. It really just makes things more
>> convenient, not faster.
>>
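For what it's worth, the row-at-a-time pattern Doug and Terrence 
describe looks like this in DBI (and since fetchall_arrayref is built on 
fetchrow_arrayref internally, you lose nothing but the memory cost). 
This is a minimal sketch using an in-memory SQLite database via 
DBD::SQLite, assumed installed, as a stand-in for the real Oracle 
connection; the table and data are invented:

```perl
use strict;
use warnings;
use DBI;

# In-memory SQLite database as a stand-in for the real Oracle handle;
# the table and rows are made up for the example.
my $dbh = DBI->connect( 'dbi:SQLite:dbname=:memory:', '', '',
    { RaiseError => 1, PrintError => 0 } );

$dbh->do('CREATE TABLE t1 (id INTEGER, name TEXT)');
$dbh->do( 'INSERT INTO t1 VALUES (?, ?)', undef, $_, "row$_" ) for 1 .. 5;

# Prepare once, outside the loop; the dependent queries on tables 2..5
# would likewise be prepared once and re-executed with bind values,
# rather than re-prepared on every iteration.
my $sth = $dbh->prepare('SELECT id, name FROM t1');
$sth->execute;

my $count = 0;
while ( my $row = $sth->fetchrow_arrayref ) {
    # ... execute the prepared table-2 query with $row->[0] bound,
    # and so on down the chain, one row at a time ...
    $count++;
}
print "processed $count rows\n";
$dbh->disconnect;
```

Preparing each statement once and reusing it with placeholders is often 
the single biggest runtime win in a nested-fetch loop like the one 
quoted above.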
