[PERFORM] Functionscan estimates

2005-04-08 Thread Josh Berkus
Folks, I'm wondering if it might be useful to be able to add estimated selectivity to a function definition for purposes of query estimation. Currently function scans automatically return a flat default 1000 estimated rows. It seems like the DBA ought to be able to ALTER FUNCTION and give ...
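For context, a minimal sketch of the behaviour under discussion, using a made-up set-returning function (three_rows is not from the thread). With no estimate attached to the function, EXPLAIN shows the planner's flat default row count rather than anything derived from the function itself:

    -- A trivial set-returning function with no estimate attached.
    CREATE FUNCTION three_rows() RETURNS SETOF integer AS $$
        SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3;
    $$ LANGUAGE sql;

    -- The function returns 3 rows, but the plan is costed with the
    -- flat default (1000 rows in the versions discussed here):
    EXPLAIN SELECT * FROM three_rows();
    --  Function Scan on three_rows  (cost=... rows=1000 width=4)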

Re: [PERFORM] Functionscan estimates

2005-04-08 Thread Michael Fuhr
On Fri, Apr 08, 2005 at 03:15:50PM -0700, Josh Berkus wrote: I'm wondering if it might be useful to be able to add estimated selectivity to a function definition for purposes of query estimation. Currently function scans automatically return a flat default 1000 estimated rows. It seems ...

Re: [PERFORM] Functionscan estimates

2005-04-08 Thread Alvaro Herrera
On Fri, Apr 08, 2005 at 04:38:20PM -0600, Michael Fuhr wrote: On Fri, Apr 08, 2005 at 03:15:50PM -0700, Josh Berkus wrote: I'm wondering if it might be useful to be able to add estimated selectivity to a function definition for purposes of query estimation. Currently function scans ...

Re: [PERFORM] Functionscan estimates

2005-04-08 Thread Alvaro Herrera
On Fri, Apr 08, 2005 at 04:04:27PM -0700, Josh Berkus wrote: My solution would be a lot simpler, since we could simply populate pg_proc.proestrows with 1000 by default if not changed by the DBA. In an even better world, we could tie it to a table, saying that, for example, proestrows = ...
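The proestrows column proposed here never appeared under that name, but the same knob was eventually exposed in later releases. As an aside for readers on PostgreSQL 8.3 or newer (not the syntax discussed in this thread), the ROWS and COST options of CREATE/ALTER FUNCTION attach a per-function estimate, defaulting to 1000 rows for set-returning functions, much as described above:

    -- Attach a row-count estimate to the example function; the value is
    -- stored in pg_proc.prorows and used in place of the 1000-row default.
    -- (ROWS and COST on ALTER FUNCTION exist from PostgreSQL 8.3 onward.)
    ALTER FUNCTION three_rows() ROWS 3 COST 1;

    EXPLAIN SELECT * FROM three_rows();
    --  Function Scan on three_rows  (cost=... rows=3 width=4)

The "tie it to a table" idea has no direct equivalent in that syntax; estimates that vary with the function's arguments only became possible much later, through planner support functions attached via pg_proc.prosupport.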

Re: [PERFORM] Functionscan estimates

2005-04-08 Thread Tom Lane
Not too many releases ago, there were several columns in pg_proc that were intended to support estimation of the runtime cost and number of result rows of set-returning functions. I believe in fact that these were the remains of Joe Hellerstein's thesis on expensive-function evaluation, and are ...
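For readers following along on a current server (an aside, not part of the thread), the per-function planner estimates now live in pg_proc.procost and pg_proc.prorows and can be inspected directly:

    -- Show the cost/row estimates the planner will use for a function.
    -- procost is expressed in units of cpu_operator_cost; prorows is
    -- only meaningful for set-returning functions (proretset = true).
    SELECT proname, procost, prorows, proretset
    FROM pg_proc
    WHERE proname = 'three_rows';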