Hello all,

I have spent the last few weeks trying out different methods for loading a 
batch of around 100,000 records.  I have had varying degrees of success 
getting the load to "work," including copy-level information, but the last 
few loads seem to have developed a search speed problem.  I say "developed" 
because I am pretty sure it wasn't horribly slow the first few times I did 
the load, but it has gotten progressively worse, to the point that any 
keyword search returning several thousand results no longer completes in the 
OPAC at all, and takes anywhere from 45 seconds to several minutes to run 
directly in postgres.  I am not certain it has gotten worse over time, but I 
am sure it is currently quite bad.
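
For what it's worth, this is roughly how I have been timing things in psql; 
the table and column names below are my guesses at where the keyword vectors 
live, so please correct me if I am poking at the wrong place:

    \timing
    EXPLAIN ANALYZE
    SELECT count(*)
      FROM metabib.keyword_field_entry
     WHERE index_vector @@ to_tsquery('history');
    -- stand-in for the keyword search the OPAC actually issues

I am happy to post the full query plan output if that would help diagnose 
this.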

In between each load I have been running build-db.sh, and before a recent 
load I even dropped and recreated the entire evergreen database.  Is there 
something else I need to do in order to get an entirely clean slate?  This 
project is my first experience with postgres, so I may be missing something 
rather obvious.
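
For reference, my "clean slate" procedure between loads amounts to the 
following (the database name is from my setup, and the last step is 
Evergreen's build-db.sh run from Open-ILS/src/sql/Pg with my usual 
connection arguments); please point out anything I am missing:

    -- as the postgres superuser, from a different database:
    DROP DATABASE evergreen;
    CREATE DATABASE evergreen WITH ENCODING 'UNICODE';
    -- then re-run build-db.sh from Open-ILS/src/sql/Pg to rebuild the schema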

My server specs are modest but by no means slow: I am currently running a 
Red Hat ES 5 virtual machine on a 2.4 GHz Opteron with 1 GB of dedicated 
RAM.

Any suggestions?  Naturally one might suspect an index problem.  Is there 
any way to verify that all the relevant indexes exist and are actually being 
used, and/or to simply rebuild them all?
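
For instance, would something along these lines be the right way to check 
and then rebuild?  (I am assuming the search indexes live in the metabib 
schema.)

    -- list the indexes postgres thinks exist on the search tables:
    SELECT schemaname, tablename, indexname
      FROM pg_indexes
     WHERE schemaname = 'metabib';

    -- rebuild every index in the database, then refresh planner statistics:
    REINDEX DATABASE evergreen;
    VACUUM ANALYZE;

I gather planner statistics can go stale after a large bulk load, so perhaps 
the VACUUM ANALYZE alone is what I have been missing?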

Thanks,
DW


