Deviloper wrote:
> I am doing a script which updates the data sets of thousands of customers on a 
> regular basis (volume data management). This should be near real time, so the 
> customers know how much data they transmitted and how much they have to pay.
> 
> Because of the type of traffic measurement, the input comes in blocks of 1-100 
> datasets which have to be updated.
> 
> I found some ideas on how to optimize the DB updates 
> (http://www.cwinters.com/programming/dc_pm_dbi.html).
> 
> Because my DB transactions are very simple updates/inserts, I am thinking 
> about the best way to optimize the process.
> 
> My thoughts:
> 
> Cache all input for a given time and combine it into a single statement, which 
> I can execute with one $dbh->do()?
> 
> or
> 
> Build the standard statement, utilizing placeholders for example:
> 
> my $sql = qq/INSERT INTO Products ( ProductCode, ProductName, Price )
>              VALUES ( ?, ?, ? )/;
> 
> my $sth = $dbh->prepare($sql);
> 
> and then execute every single statement on its own.
> 
> or 
> 
> Keep it simple and do() every dataset on its own against the db. (I guess 
> this is the worst idea.)

First - write something that does what you want as cleanly and intelligibly as
possible.

Second - see if it's already fast enough. If not, profile your code and
start to optimise it.
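When it comes to measuring, Devel::NYTProf from CPAN gives a full profile, but for a head-to-head comparison of the strategies above the core Benchmark module is often enough. A sketch; the two subs are stand-ins, not real DB code:

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

# Stand-in workloads: replace these subs with your real do()-per-row and
# prepared-statement implementations operating on one block of datasets.
my @block = map { [ $_, "customer$_", $_ * 10 ] } 1 .. 100;

cmpthese( -1, {                       # -1 = run each sub for ~1 CPU second
    per_row_do => sub { my $n = 0; $n += scalar @$_ for @block; },
    prepared   => sub { my $n = 0; $n += scalar @$_ for @block; },
} );
```

cmpthese prints a table of rates and relative percentages, which tells you directly whether the optimisation is worth the extra complexity.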

Rob

-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
http://learn.perl.org/

