--- In [email protected], "un_spoken"  wrote:
>
> 
> 
> --- In [email protected], "rudibrazil78"  wrote:
> >
> > Hi all
> > 
> > We've been having to deal with a bad situation for quite a few years now:
> > how to handle large imports without taking the server down (in fact, the
> > data is meant to be accessible in production as soon as it's ready).
> 
> We have the same problem. We import about 1,000,000 rows daily. Roughly
> 100,000 of them are new each day; the rest update rows that are already in
> the database. The import takes about an hour for 100k rows, and many things
> can go wrong during it.
> 
> I would like to hear your strategies for this sort of problem. :)
>


Hi,

I import 300,000,000 rows into my database each day with a MERGE statement,
using external tables as the import source.

The whole import takes about 2 hours, and the performance degradation is
acceptable. With 1,000,000 rows and an external table you can finish in a few
minutes, e.g. 3 minutes. External tables are the solution for imports like
this.
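For example, a minimal sketch (the table, file and column names below are only
illustrative, and I am assuming a Firebird-style external table here, which
reads fixed-length records, so every field is CHAR):

-- External table mapped to the flat file produced by the exporter.
-- The file's location must be allowed by ExternalFileAccess in firebird.conf.
CREATE TABLE DAILY_EXT EXTERNAL FILE '/data/import/daily.dat' (
    CUST_ID    CHAR(10),
    CUST_NAME  CHAR(60),
    BALANCE    CHAR(18),
    CRLF       CHAR(2)   -- line break written by the exporter; adjust or drop to match the file
);

-- One statement updates the existing rows and inserts the new ones.
MERGE INTO CUSTOMERS c
USING DAILY_EXT e
    ON c.CUST_ID = CAST(TRIM(e.CUST_ID) AS INTEGER)
WHEN MATCHED THEN
    UPDATE SET c.CUST_NAME = TRIM(e.CUST_NAME),
               c.BALANCE   = CAST(TRIM(e.BALANCE) AS DECIMAL(18,2))
WHEN NOT MATCHED THEN
    INSERT (CUST_ID, CUST_NAME, BALANCE)
    VALUES (CAST(TRIM(e.CUST_ID) AS INTEGER),
            TRIM(e.CUST_NAME),
            CAST(TRIM(e.BALANCE) AS DECIMAL(18,2)));

The nice part is that the staging costs no inserts inside the database, and the
production table is touched by the single MERGE only, so readers keep seeing
the old data until that transaction commits.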

regards,
Karol Bieniaszewski
