"Kevin Grittner" <[email protected]> writes:
> Tom Lane <[email protected]> wrote:
>> I don't have a better idea at the moment :-(
> It's been a while since I've been bitten by this issue -- the last
> time was under Sybase. The Sybase suggestion was to either add
> "dummy rows" [YUCK!] to set the extreme bounds or to "lie to the
> optimizer" by fudging the statistics after each generation. Perhaps
> we could do better by adding columns for high and low bounds to
> pg_statistic. These would not be set by ANALYZE, but would be
> user-modifiable to cover exactly this problem? NULL would mean
> current behavior?
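
For concreteness, I read that suggestion as something along these lines
(the stauserlo/stauserhi columns and the "orders" table are invented here
purely for illustration; nothing like them exists in pg_statistic today):

    -- hypothetical columns; an UPDATE like this would override the
    -- histogram's extremes, and leaving them NULL would fall back to
    -- today's behavior
    UPDATE pg_statistic
       SET stauserlo = 1, stauserhi = 2000000000
     WHERE starelid = 'orders'::regclass
       AND staattnum = (SELECT attnum FROM pg_attribute
                        WHERE attrelid = 'orders'::regclass
                          AND attname = 'id');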
Well, the problem Josh has got is exactly that a constant high bound
doesn't work.
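
To be concrete about the failure mode (the "events" table and "created_at"
column below are made-up names): a steadily-advancing column runs past
whatever high bound the last ANALYZE captured, so range queries over the
newest data are estimated as nearly empty no matter what bound you pin:

    ANALYZE events;   -- histogram high bound = max(created_at) as of right now
    -- ...a few hours of inserts later...
    EXPLAIN SELECT * FROM events
     WHERE created_at > now() - interval '1 hour';
    -- everything above the recorded high bound is estimated at close to
    -- zero rows, which is what leads the planner astray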
What I'm wondering about is why he finds that re-running ANALYZE
isn't an acceptable solution. It's supposed to be a reasonably
cheap thing to do.
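
For instance (again with made-up names), re-analyzing just the problem
column, or telling autovacuum to re-analyze the table after a smaller
fraction of changes (assuming a server new enough to have per-table
autovacuum settings), is pretty lightweight:

    -- recompute statistics for only the column whose high bound goes stale
    ANALYZE events (created_at);

    -- or let autovacuum do it sooner: re-analyze after ~1% of rows change
    ALTER TABLE events SET (autovacuum_analyze_scale_factor = 0.01);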
I think the cleanest solution to this would be to make ANALYZE
cheaper, perhaps by finding some way for it to work incrementally.
regards, tom lane