On 2/28/22 18:21, Tom Lane wrote:
> Consider this admittedly-rather-contrived example:
>
> regression=# create table foo(f1 int);
> CREATE TABLE
> regression=# alter table foo add column bar text default repeat('xyzzy', 1000000);
> ERROR:  row is too big: size 57416, maximum size 8160
>
> Since the table contains no rows at all, this is a surprising
> failure.  The reason for it of course is that pg_attribute
> has no TOAST table, so it can't store indefinitely large
> attmissingval fields.
>
> I think the simplest answer, and likely the only feasible one for
> the back branches, is to disable the attmissingval optimization
> if the proposed value is "too large".  Not sure exactly where the
> threshold for that ought to be, but maybe BLCKSZ/8 could be a
> starting offer.
>


WFM. After all, it's taken several years for this to surface. Is it
based on actual field experience?
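
For anyone following along at home, the diagnosis is easy to confirm: pg_attribute has no TOAST table, so whatever goes into attmissingval has to fit in the heap page. Something like this (output approximate) shows it:

regression=# select reltoastrelid from pg_class where relname = 'pg_attribute';
 reltoastrelid
---------------
             0
(1 row)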

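In the meantime, and untested here, splitting the ADD COLUMN from the SET DEFAULT should dodge the failure, since nothing ever gets stored in attmissingval and the default lives only as an expression in pg_attrdef. With an empty table, as in the example, the end state is the same:

regression=# create table foo(f1 int);
CREATE TABLE
regression=# alter table foo add column bar text;
ALTER TABLE
regression=# alter table foo alter column bar set default repeat('xyzzy', 1000000);
ALTER TABLE

(On a non-empty table the two forms differ, of course: the split version leaves the pre-existing rows NULL rather than back-filling the default.)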

cheers


andrew


--
Andrew Dunstan
EDB: https://www.enterprisedb.com


