A couple months back I wrote:

> "Eric Thompson" <eric.thomp...@salliemae.com> writes:
>> [ http://archives.postgresql.org//pgsql-bugs/2009-03/msg00116.php ]
> Hmm. Tracing through this, it seems your child tables have exactly 101
> separate constraint clauses; removing one from the parent table gets it
> down to 100. Which is where the cutoff installed by this patch is:
> http://archives.postgresql.org/pgsql-committers/2008-11/msg00146.php
>
> That patch was in response to this complaint:
> http://archives.postgresql.org/pgsql-general/2008-11/msg00446.php
>
> I'm not entirely sure about a better approach; just moving the cutoff
> around doesn't seem like it will do anything except change who's
> complaining...

I spent some more time chewing on this. I still don't see a good solution
to detect and avoid the repeated determinations that "x <> const1" can't
refute "x <> const2". It seems possible to do, but it would uglify the
code in predtest.c tremendously.

The best thing I can think of that seems practical (especially as a
back-patch) is just to rejigger the complexity cutoff installed by the
above-mentioned patch. Since the cases that are known to be objectionable
involve long IN-lists (ie, ScalarArrayOpExpr with a large array), what I'm
considering doing is reverting the patch for all but that case --- ie,
have a complexity limit only for ScalarArrayOpExpr.

Comments?

			regards, tom lane

--
Sent via pgsql-hackers mailing list (pgsql-hackers@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-hackers
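[As an illustration of the cost being discussed: the proof machinery must test each query clause against each clause of every child table's CHECK constraints, so work grows with the product of the two clause counts, and every "x <> const1" vs. "x <> const2" pair is examined even though such a pair can never refute. The sketch below is hypothetical stand-in code, not PostgreSQL's predtest.c; the function names and the toy refutation rule are invented for illustration only.]

```python
# Hypothetical sketch of the pairwise refutation search (NOT predtest.c):
# constraint exclusion tries to prove a child table's constraints refute
# the query's restriction clauses, testing clause pairs one by one.

def can_refute(query_clause, constraint_clause):
    """Toy stand-in for a single refutation test: here, "x = c" and
    "x <> c" refute each other; two <> clauses never refute anything."""
    (qop, qval), (cop, cval) = query_clause, constraint_clause
    return qval == cval and {qop, cop} == {"=", "<>"}

def refutation_tests(query_clauses, constraint_clauses):
    """Count pairwise tests performed before a proof is found (or not).
    With many <> clauses on both sides, every pair is examined and none
    refutes -- the repeated wasted work described in the message above."""
    tests = 0
    for q in query_clauses:
        for c in constraint_clauses:
            tests += 1
            if can_refute(q, c):
                return tests  # refuted: this child can be excluded
    return tests  # no refutation found after testing all pairs

# A child with 101 "x <> constN" constraint clauses vs. one query clause:
query = [("<>", 0)]
child = [("<>", i) for i in range(1, 102)]
print(refutation_tests(query, child))  # 101 pairs examined, none refute
```

Under this toy model, capping the number of clauses considered (as the cutoff patch does) bounds the pairwise work at the price of sometimes giving up on a provable exclusion, which is the trade-off the proposed ScalarArrayOpExpr-only limit is trying to narrow.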