2008/8/6 Brandon W. Uhlman <[EMAIL PROTECTED]>:
I have about 960 000 bibliographic records I need to import into an
Evergreen system. The database server is dual quad-core Xeons with 24GB
of RAM.

Currently, I've split the bibliographic records into 8 batches of ~120K
records each and did the marc_bre/direct_ingest/parallel_pg_loader dance,
but one of those files has been chugging along in psql now for more than
16 hours. How long should I expect these files to take? Would more,
smaller files load more quickly in terms of total time for the same full
record set?

For what it's worth, my experience is that, given a large number of bibs to import, many smaller batches complete faster than a few larger ones.
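As a rough sketch of the "many smaller batches" approach: assuming the records have already been exported to a text format with one record per line (binary MARC would need a record-aware splitter instead), the standard `split` utility can do the chunking. The filenames and the per-batch size here are illustrative, not from the thread.

```shell
# Generate a stand-in for the exported record file (hypothetical name;
# 1000 dummy "records", one per line).
seq 1 1000 > records.txt

# Split into batches of 100 records each; produces 10 files named
# batch_aa through batch_aj, each of which could then be fed through
# the ingest/load pipeline independently.
split -l 100 records.txt batch_

# Count the resulting batch files.
ls batch_* | wc -l   # 10 batch files of 100 records each
```

Loading each batch as its own psql transaction also means a failure partway through costs you one small batch, not 120K records of work.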

John
