max user processes (-u) unlimited
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
Regards,
Michael A.
Tom Lane wrote:
Andrew Sullivan <[EMAIL PROTECTED]> writes:
On Tue, Jan 08, 2008 at 05:27:16PM +0100, Michael Akinde wrote:
Tom Lane wrote:
Michael Akinde <[EMAIL PROTECTED]> writes:
$> ulimit -a
core file size (blocks, -c) 1
...
What you're showing us is the conditions that prevail in your
interactive session. That doesn't necessarily have a lot to do with
the ulimit settings the postmaster is actually started under.
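One way to check the limits the server process actually runs under (a sketch, assuming a Linux kernel recent enough to expose /proc/<pid>/limits, and that the server process is named postmaster):
$> cat /proc/$(pgrep -o postmaster)/limits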
Tom Lane wrote:
Michael Akinde <[EMAIL PROTECTED]> writes:
INFO: vacuuming "pg_catalog.pg_largeobject"
ERROR: out of memory
DETAIL: Failed on request of size 536870912
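(For scale: 536870912 bytes = 512 x 1024 x 1024, i.e. a single 512 MB allocation request.)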
Are you sure this is a VACUUM FULL, and not a plain VACUUM?
Very sure.
Ran a VACUUM FULL again yesterday; as it seems, VACUUM FULL
doesn't work for tables beyond a certain size. Assuming we have not set
something up completely wrongly, this looks like a bug.
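For clarity, the two commands being distinguished above (a minimal sketch; pg_largeobject is the table from this thread):
VACUUM pg_largeobject;      -- reclaims dead rows in place, no exclusive lock
VACUUM FULL pg_largeobject; -- compacts the table under an exclusive lock; much heavier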
If this is the wrong mailing list to be posting this, then please let me
know.
Regards,
Michael Akinde
Database Architect, Met.no
A single 512 MB request should not be that much for a large database. I'd expect an
operation on such a table to take time, of course, but not to
consistently crash out of memory.
Any suggestions as to what we can otherwise try to isolate the problem?
Regards,
Michael Akinde
Database Architect, met.no
Michael Akinde wrote:
[Synopsis: VACUUM FULL ANALYZE goes out of memory on a very large
pg_catalog.pg_largeobject table.]
Simon Riggs wrote:
Can you run ANALYZE and then VACUUM VERBOSE, both on just
pg_largeobject, please? It will be useful to know whether they succeed.
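For reference, the commands being requested (plain VACUUM VERBOSE, not VACUUM FULL):
ANALYZE pg_largeobject;
VACUUM VERBOSE pg_largeobject;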
ANALYZE:
INFO: analyzing "pg_catalog.pg_largeobject"
However, the problem also occurred with shared_buffers set at
24 MB and maintenance_work_mem at its default setting (16 MB?), so I
would be rather surprised if the problem did not repeat itself.
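As those settings would appear in postgresql.conf (values from this thread; this assumes a server version, 8.2 or later, that accepts memory units, and the 16 MB default is a guess):
shared_buffers = 24MB          # shared buffer cache; changing it requires a restart
maintenance_work_mem = 16MB    # memory cap for VACUUM, CREATE INDEX, etc.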
Regards,
Michael Akinde
Database Architect, met.no
Stefan Kaltenbrunner wrote:
Michael Akinde wrote:
Incidentally, in the first error of the two I posted, the shared
memory setting was significantly lower (24 MB, I believe). I'll try
with 128 MB before I leave in the evening, though (assuming the other
tests I'm running complete).
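A sketch of the change being described, assuming postgresql.conf is edited by hand and $PGDATA points at the cluster's data directory:
# in postgresql.conf: shared_buffers = 128MB
$> pg_ctl restart -D $PGDATA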
On 2007-12-11 at 10:59 +0100, Michael Akinde wrote:
I am encountering problems when trying to run VACUUM FULL ANALYZE on a
particular table in my database; namely that the process crashes out
with the following problem:
Probably just as well, since a VACUUM FULL on an 800GB table is going to take a very long time.
We will need this kind of
operation, so we'd really like to ensure that we can get VACUUM working
(although the data is mostly going to be static on this installation, we
will have others that won't be).
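Where the data does change, routine maintenance could use plain VACUUM rather than FULL (a minimal sketch; pg_largeobject stands in for any affected table):
VACUUM ANALYZE pg_largeobject;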
Anyone with some insights on VACUUM FULL ANALYZE who can weigh in on
what is going wrong?
Regards,
Michael