[
https://issues.apache.org/jira/browse/DERBY-1908?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Rick Hillegas closed DERBY-1908.
--------------------------------
Resolution: Won't Fix
It looks to me as though the question was answered.
> Investigate: What's the "unit" for optimizer cost estimates?
> ------------------------------------------------------------
>
> Key: DERBY-1908
> URL: https://issues.apache.org/jira/browse/DERBY-1908
> Project: Derby
> Issue Type: Task
> Components: SQL
> Reporter: A B
>
> Derby optimizer decisions are necessarily based on cost estimates. But what
> are the "units" for these cost estimates? There is logic in
> OptimizerImpl.getNextPermutation() that treats cost estimates as if their
> unit is milliseconds--but is that really the case?
> The answer to that question may in fact be "Yes, the units are
> milliseconds"--and maybe the unexpected cost estimates that are sometimes
> seen are really caused by something else (e.g., DERBY-1905). But if that's
> the case, it would be great to look at the optimizer costing code (see
> especially FromBaseTable.estimateCost()) to verify that all of the "magic"
> of costing really makes sense given that the underlying unit is supposed
> to be milliseconds.
> Also, if the stats/cost estimate calculations are truly meant to be in
> terms of milliseconds, I can't help but wonder what machine and what
> criteria the millisecond figures were based on. Is it time to update the
> stats for "modern" machines, or perhaps (shooting for the sky) to
> dynamically adjust the millisecond stats based on the machine that's
> running Derby and use the adjusted values somehow? I have no answers to
> these questions, but I think it would be great if someone out there were
> inclined to discuss/investigate these kinds of questions a bit more...
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.