- Original Message
> From: Tom Lane
>> [ COPY fails to dump a 138MB bytea column ]
> I wonder whether you are doing anything that exacerbates
> the memory requirement, for instance by forcing an encoding conversion to
> something other than the database's server_encoding.

Our backups ...
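For context on the encoding question above: COPY writes its output in the session's client_encoding, so forcing an encoding different from server_encoding makes the server build an extra converted copy of each row. A minimal way to rule that out from psql is sketched below; the 'UTF8' value is only a placeholder for whatever the first command actually reports, and pg_dump's -E/--encoding switch controls the same setting for dumps.

    -- Report the encoding the server stores data in.
    SHOW server_encoding;

    -- Match the session to it so COPY does not have to convert each row
    -- on the way out ('UTF8' is a placeholder for the value shown above).
    SET client_encoding TO 'UTF8';

    -- psql's \copy then writes the table to a client-side file with no
    -- extra conversion step.
    \copy largedata TO 'largedata.copy'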
Sebastien Boisvert writes:
> [ COPY fails to dump a 138MB bytea column ]

If you can't switch to a 64-bit build of Postgres, you might need to
think about converting those byteas to large objects. It's expected for
COPY to require memory space equal to several times the width of the row
it's trying to ...
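Two follow-up notes on the suggestion above, for anyone hitting the same wall. SELECT version() shows the architecture the server was built for, which is an easy way to tell whether a bundled binary is 32-bit. And below is a minimal sketch of moving a bytea column into large objects; the column names (data, data_lo) are assumptions since the real schema isn't shown, and lo_from_bytea() only exists in PostgreSQL 9.4 and later (older servers need lo_create()/lowrite() or a client-side loop instead).

    -- The version string includes the build target; something like
    -- "x86_64-apple-darwin" indicates a 64-bit binary, "i386" a 32-bit one.
    SELECT version();

    -- Add an oid column to hold a reference to each large object.
    ALTER TABLE largedata ADD COLUMN data_lo oid;

    -- Copy every bytea value ("data" is an assumed column name) into a
    -- new large object; passing 0 lets the server pick an unused OID.
    UPDATE largedata
       SET data_lo = lo_from_bytea(0, data);

    -- After the application switches to the large-object API, the old
    -- column can be dropped.
    ALTER TABLE largedata DROP COLUMN data;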
Hi all,

We have an OS X app which integrates postgres as its database backend, and
recently we've had a couple of cases where users haven't been able to perform
a backup of their database. The failure gets reported as a problem in a table
("largedata") where we store large binary objects, wi