1. I'd suppose a lot of this advice (e.g. using prepared statements) applies to other/all environments (.NET, classic VB6, etc.).
2. So what *are* you using, then?
On 7-2-2013 2:06, W O wrote:
> Thank you Dmitry, but it seems to me that you think I am using Delphi or
> something similar, and that is not the case.
>
> But your advice is very good, thanks again.
>
> On Wed, Feb 6, 2013 at 4:40 PM, Dmitry Kuzmenko <[email protected]> wrote:
>> The most common ways to speed up an import:
>>
>> 1. Do not use cached datasets to read the source data. The more records
>> you read, the more cache the client program allocates, and the slower the
>> import becomes. The source dataset should buffer only one row at a time.
>>
>> 2. Do not commit on each insert. That's the rule: the more commits, the
>> slower the performance. If you can't control transactions during the
>> import, you will fail.
>>
>> 3. A lot of indices on the target table slows down inserts and updates.
>> Turn those indices off (inactive) for the import, or at least turn off
>> the less selective ones.
>>
>> 4. Do not use datasets for the target either. Inserts should be done with
>> a plain "INSERT ..." query and nothing else.
>>
>> 5. Use parameters to insert the data. A plain-text INSERT with the values
>> embedded pays the prepare/execute cost for every row; with parameters you
>> prepare once and then loop over change_params/execute.
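[Editor's note: a minimal sketch of points 1-5 above, assuming the target database is Firebird (which the ALTER INDEX INACTIVE/ACTIVE step relies on) and the Python fdb DB-API driver; the connection string, the TARGET_TBL table and its columns, the index name IDX_TARGET_NAME, and source.csv are all placeholders for whatever your own environment uses.]

    # Sketch of the import loop from points 1-5, under the assumptions above.
    import csv
    import fdb

    BATCH_SIZE = 5000  # point 2: commit every N rows, not on every insert

    con = fdb.connect(dsn="localhost:/data/mydb.fdb",
                      user="SYSDBA", password="masterkey")
    cur = con.cursor()

    # Point 3: deactivate a non-constraint index for the duration of the import.
    cur.execute("ALTER INDEX IDX_TARGET_NAME INACTIVE")
    con.commit()

    # Points 4 and 5: one plain parameterized INSERT, prepared once and
    # re-executed with fresh parameter values for every row.
    ps = cur.prep("INSERT INTO TARGET_TBL (ID, NAME, AMOUNT) VALUES (?, ?, ?)")

    pending = 0
    with open("source.csv", newline="") as src:
        # Point 1: the csv reader yields one row at a time, so nothing beyond
        # the current record is cached on the client side.
        for rec in csv.reader(src):
            cur.execute(ps, (int(rec[0]), rec[1], float(rec[2])))
            pending += 1
            if pending >= BATCH_SIZE:
                con.commit()      # point 2: batched commits
                pending = 0

    con.commit()  # flush the final partial batch

    # Reactivating the index rebuilds it once, over the full data set.
    cur.execute("ALTER INDEX IDX_TARGET_NAME ACTIVE")
    con.commit()
    con.close()

[The batch size is a tuning knob: a few thousand rows per commit is a common compromise; the main thing to avoid is a commit per row.]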
