I have a database taking up 220 GB of a 300 GB drive, so I purged some old data from it. When I try to vacuum/analyze using pgAdmin III, I keep getting an out-of-memory error. I am running Windows Vista 64-bit with 8 GB of RAM, and I have changed the following postgresql.conf variables:

#custom_variable_classes = ''        # list of custom variable class names
shared_buffers = 1024MB
effective_cache_size = 2048MB
work_mem = 64MB
maintenance_work_mem = 1024MB
commit_delay = 10000
checkpoint_segments = 128
checkpoint_completion_target = 0.9
wal_buffers = 2MB
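
In case it helps, the maintenance job is just the standard one from pgAdmin III's Maintenance dialog; as far as I understand it, that boils down to something like the SQL below (shown here only for reference):

-- Database-wide vacuum plus analyze, which is what I believe
-- pgAdmin III's Maintenance dialog issues for the whole DB
VACUUM ANALYZE;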

Any ideas on how I can avoid running out of memory and run a successful vacuum/analyze?
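
One workaround I was thinking about trying, in case it changes the answer: running the command from psql with a smaller per-session maintenance_work_mem and going table by table instead of over the whole database, roughly like this (the table name is just a placeholder):

-- Untested idea: shrink maintenance_work_mem for this session only,
-- then vacuum/analyze one table at a time ('orders' is a placeholder name)
SET maintenance_work_mem = '256MB';
VACUUM ANALYZE orders;

Would that be expected to help here, or is the problem likely elsewhere?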

Thanks,
Ben
