On Fri, Sep 01, 2006 at 03:56:19PM +0700, Jeroen T. Vermeulen wrote:
> That's a very common thing in processor design as well, and there's a
> standard trick for it: the saturating two-bit counter. It tends to work
> pretty well for branch prediction, value prediction etc. Usually it's the
> first thing you reach for, so of course somebody may already have tried it
> here and found it didn't work.
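[For concreteness, a minimal sketch of the standard two-bit saturating counter referred to above, as used in branch predictors: four states clamped to [0, 3], bumped up on a "hit" and down on a "miss", predicting positive when the state is 2 or more. This illustration (including the class name) is mine, not from the thread; applied to plan caching, a "hit" would mean the cached plan's parameter assumption still held.]

```python
class TwoBitCounter:
    """Saturating two-bit counter.

    States: 0 = strongly-no, 1 = weakly-no, 2 = weakly-yes, 3 = strongly-yes.
    """

    def __init__(self, state: int = 1):
        self.state = state

    def predict(self) -> bool:
        # Predict positive in the two "yes" states.
        return self.state >= 2

    def update(self, outcome: bool) -> None:
        # Saturating increment/decrement: the state never leaves [0, 3],
        # so a single anomalous outcome cannot flip a strong prediction.
        if outcome:
            self.state = min(3, self.state + 1)
        else:
            self.state = max(0, self.state - 1)
```

The point of the saturation is hysteresis: after a run of hits the counter sits at 3, and one stray miss only drops it to 2, so the prediction survives occasional outliers.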
Interesting thought. It might be worth trying. But my big question: is all
this testing and counting actually going to be faster than just replanning?
PostgreSQL's planner is not that slow.

> Of course there's a problem when parameters do not match predicted values.
> That's where having one or two backup plans could come in handy. You
> could keep your original, fully-generalized plan around. If plans are
> cheap enough to store, you could try to keep a cache of old plans for the
> same query. The great thing about keeping some backup plans around is
> that a pseudo-constant parameter can have a different value once in a
> while, then flick back to its old habits without invalidating all your
> efforts. Your usually-unused search fields are a good example. You may
> also have two stable parameter patterns with different sets of
> pseudo-constants competing for your attention.

The thing is that the number of possible plans grows roughly with the
factorial of the number of tables. Once you have three tables you already
have at least a dozen possible plans, probably more, and which plan is best
depends strongly on what the parameters are.

Anyway, your scheme assumes that you have information to work with. The
current system plans prepared queries with no information at all about the
parameters, and people are advocating keeping it that way. I think a good
first step would be to plan on first execution, like Oracle does.

Have a nice day,
-- 
Martijn van Oosterhout <email@example.com>   http://svana.org/kleptog/
> From each according to his ability. To each according to his ability to
> litigate.