> -----Original Message-----
> From: [email protected] 
> [mailto:[EMAIL PROTECTED] On behalf of Contact 42
> Sent: Thursday, June 12, 2008 04:20
> To: [email protected]
> Subject: Re: Bulk inserts
> 
> 
> Jonathan Vanasco wrote:
> > write to a .txt log in some sort of standardized format; 
> > when it hits 10k lines, run a batch query
> What happens if an insert fails?

Depending on the locking behaviour of the bulk loader, it can
make sense to load the data into "simple" preprocessing tables
which have no validation logic or primary key constraints.

The final processing is then done from these tables. This moves
the error handling out of the bulk loader and into procedures
inside the database, and sidesteps bulk-loader-specific issues.
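A minimal sketch of that staging-table pattern, using Python's sqlite3 purely for illustration (the original discussion concerned Oracle-style loaders; the table and column names here are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging table: no primary key, no constraints -- the bulk loader
# can write into it freely without tripping any validation.
cur.execute("CREATE TABLE staging_orders (id TEXT, amount TEXT)")

# Final table: the constraints live here.
cur.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, "
    "amount REAL CHECK (amount >= 0))"
)

# Simulated bulk load, deliberately including two bad rows
# (a negative amount and a duplicate key).
rows = [("1", "10.5"), ("2", "-3"), ("1", "7.0"), ("3", "42")]
cur.executemany("INSERT INTO staging_orders VALUES (?, ?)", rows)

# Final processing: move rows across, keeping error handling in
# the database layer instead of in the bulk loader itself.
failed = []
for id_, amount in cur.execute(
        "SELECT id, amount FROM staging_orders").fetchall():
    try:
        conn.execute("INSERT INTO orders VALUES (?, ?)",
                     (int(id_), float(amount)))
    except (sqlite3.IntegrityError, ValueError) as exc:
        failed.append((id_, amount, str(exc)))

conn.commit()
good = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(good, len(failed))  # 2 rows accepted, 2 rejected
```

In a real Oracle setup the per-row loop would typically be a PL/SQL procedure (or a set-based INSERT ... SELECT with an error-logging clause), but the division of labour is the same: the loader only fills the staging table, and validation failures are caught downstream.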

For context: in my experience (from the days when Oracle 8
was "the" standard), some bulk loaders are quite ruthless about
the resources and connections of other users. They lock the table,
sometimes the whole database, may disable any validation
logic, and do more or less direct disk writes into the table
data blocks in the file system.

I've worked for a large telco that did a data migration of
its core BSS systems, moving 750 million records (around
25 million customers) from one database version to the next.

There are often several different bulk loading mechanisms available...

Error handling strongly depends on the method used. 

Just my 2cents,
Andrew



--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"pylons-discuss" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/pylons-discuss?hl=en
-~----------~----~----~----~------~----~------~--~---
