On Tue, 05 Feb 2013 13:21:01 -0000, "rudibrazil78" <[email protected]> wrote:

> Hi all
>
> We've been having to deal with a bad situation for quite a few years now -
> how to handle large imports without taking the server down (in fact that
> data is meant to be accessible as soon as it's ready in production).
>
> This is call center software which routinely imports lists of clients
> and possible clients to contact.
How often do you do this, and isn't it possible to do it in off-hours (eg at night, or during a lunch break)?

> The import happens on 2 main tables, clients and calls, which are exactly
> the same tables the operators will use/update while working.

Are you only adding records, or also updating existing ones? And do you do everything in a single transaction, a transaction per individual change, or something in between (eg a commit for every x updates)?

> These are very large tables (over 200 fields each), with many fk's (and a
> lot of the fk indexes are reportedly bad ones).
>
> Not only is importing taking very long, but while it is happening,
> performance in production takes a severe hit.

What is 'very long', and how many inserts/updates are we talking about?

> I know there's no magic wand, but I seriously need to start making a
> checklist of practices I should be trying out to work around this
> situation.
>
> Thanks in advance... please let me know if I can clarify the issue
> further.

You could consider adding a separate staging table: first load the data into that staging table, and only then (eg using a stored procedure, or using MERGE or UPDATE OR INSERT, etc) move it into the actual table(s). You might even massage the data before loading it into the final destination (eg remove records that don't need to be updated). This might not necessarily improve things, but I expect it will decrease the time needed to load the data and also reduce the performance hit for your operators. A rough sketch of what I mean follows below.
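To make the staging approach concrete, here is a minimal sketch (in Firebird syntax, since UPDATE OR INSERT is Firebird-specific). The table and column names (clients_staging, client_id, name, phone) are placeholders, not your actual schema; a real clients table with 200+ fields would list far more columns:

-- Hypothetical staging table holding only the imported columns;
-- substitute the real column list of your clients table.
CREATE TABLE clients_staging (
  client_id INTEGER NOT NULL,
  name      VARCHAR(100),
  phone     VARCHAR(30)
);

-- Optional massaging step: drop staged rows that would not change
-- anything, so the merge touches as few production records as possible.
DELETE FROM clients_staging s
WHERE EXISTS (
  SELECT 1 FROM clients c
  WHERE c.client_id = s.client_id
    AND c.name IS NOT DISTINCT FROM s.name
    AND c.phone IS NOT DISTINCT FROM s.phone
);

-- Apply the remainder in one pass (MERGE needs Firebird 2.1 or higher).
MERGE INTO clients c
USING clients_staging s
ON c.client_id = s.client_id
WHEN MATCHED THEN
  UPDATE SET name = s.name, phone = s.phone
WHEN NOT MATCHED THEN
  INSERT (client_id, name, phone)
  VALUES (s.client_id, s.name, s.phone);

-- Or row by row, with UPDATE OR INSERT instead of MERGE:
-- UPDATE OR INSERT INTO clients (client_id, name, phone)
-- VALUES (:client_id, :name, :phone)
-- MATCHING (client_id);

Because the staging table has none of the fk's, triggers or production indexes of the real table, the bulk load into it stays cheap and can be committed in batches (a commit every x rows, as mentioned above) without touching the tables the operators are working in; only the final MERGE hits production.

Mark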
