To whom it may concern,

I am a Java developer using PostgreSQL as a DB server.  My development team and I have a product comprising about 50 tables, with about 10,000 records in the largest table.  We have looked in books and on the web for concrete answers to several problems that are plaguing us.  Now we look to the source.

Issue #1: Mass deletion of records.

Q - If many records (e.g. hundreds) are deliberately deleted, those records get flagged for later removal.  What is the best sequence of operations to optimize the database afterwards?  Is it VACUUM, then REINDEX, then VACUUM ANALYZE?
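For concreteness, something like the following JDBC sketch is what we have in mind after a mass delete (the connection URL, credentials, and the table name "mytable" below are placeholders, not our real schema):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class PostDeleteMaintenance {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details; substitute your own.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost:5432/mydb", "user", "password")) {
                // VACUUM cannot run inside a transaction block, so keep autocommit on.
                conn.setAutoCommit(true);
                try (Statement st = conn.createStatement()) {
                    st.execute("VACUUM mytable");          // reclaim space left by deleted rows
                    st.execute("REINDEX TABLE mytable");   // rebuild the table's indexes
                    st.execute("VACUUM ANALYZE mytable");  // refresh the planner's statistics
                }
            }
        }
    }

Is this ordering right, or should some of these steps be dropped or reordered?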

Some of what I have read suggests that running a VACUUM without a REINDEX can leave a given index invalid (i.e. with entries pointing to rows that no longer match the index criteria).

This would in turn suggest that a subsequent VACUUM ANALYZE would produce incorrect planner statistics.

Any help regarding the best maintenance policy, the ramifications of mass deletions, vacuuming, and re-indexing would be most appreciated.  Thanks.


