I'll see if it's possible to do this on the AWS PostgreSQL RDS service, but
this sort of thing is usually not
On 21 March 2018 at 12:59, Pavel Stehule wrote:
> 2018-03-21 13:56 GMT+01:00 Pavel Stehule :
>> 2018-03-21 13:03 GMT+01:00 Gary Cowell :
We are trying to write PostgreSQL code that loads a large object into
a bytea column in chunks, to avoid reading the whole file into memory
on the client.
Our first attempt was:

update build_attachment set chunk = chunk || newdata;

This did not scale, and got significantly slower after 4000-5000
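For context on why this slows down: each `chunk || newdata` update rewrites the entire accumulated bytea value (and its TOAST storage), so total work grows quadratically with the number of chunks. One standard alternative is PostgreSQL's server-side large-object functions, which append in place. A minimal sketch, assuming PostgreSQL 9.4 or later; the chunk values, OID placeholder, and offset here are illustrative, not from the original post:

```sql
-- Create a large object from the first chunk; passing 0 as the OID
-- lets the server choose a free OID, which is returned to the caller.
SELECT lo_from_bytea(0, '\x0102'::bytea);

-- Append each subsequent chunk at the current end offset with lo_put,
-- instead of rewriting the whole value as chunk || newdata does.
-- (:loid and the offset would come from the earlier calls.)
SELECT lo_put(:loid, 2, '\x0304'::bytea);
```

Whether the large-object API is usable depends on the environment; managed services such as RDS restrict server-side file access (lo_import from a server path), though the lo_from_bytea/lo_put functions operate on data sent from the client.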