On 3/14/2011 10:21 AM, Gerald Britton wrote:

Any idea why Python works this way?  I see that, in 3.2, an
optimization was done for sets (See "Optimizations" at
http://docs.python.org/py3k/whatsnew/3.2.html) though I do not see
anything similar for dictionaries.


1/ because no one would ever see the difference.

The same thing could be said about sets, yet a similar optimization
was added in 3.2.

For one and only one context.

2/ immutables can always be evaluated before any high CPU consuming loop

immutables could also be evaluated at compile time, which would
obviate any such concern.
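In fact, CPython's compiler already folds some arithmetic on immutable literals into constants at compile time, which can be observed in a function's co_consts. A minimal sketch (which expressions get folded varies by CPython version, so this is illustrative, not a guarantee):

```python
def folded():
    # CPython's compiler folds this arithmetic on immutable int
    # literals into a single constant at compile time.
    return 60 * 60 * 24


# In recent CPython versions, the folded value 86400 appears
# directly among the compiled constants.
print(86400 in folded.__code__.co_consts)
```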

3/ it would make the implementation more complex (i.e. more work for our
beloved active community) for no gain

See my reply to 1/ above.

You are missing the point. *Every* optimization has a cost for *everyone* (the test to see whether it should be applied or not). It has a gain for a few -- those for whom the test is true. Do those gains outweigh the cost?

In the particular case of 'ob in constant_set', people were avoiding that and instead writing 'ob in constant_tuple' because of the cost of recreating the set (but not the tuple) each time the expression is evaluated. Note that frozenset(constant_set) is even worse because *that* is not optimized either. If the set is very big, the hash lookup is faster than a linear scan of a tuple. So the cost-benefit decision was 'yes'.
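A sketch of the distinction (assumes CPython 3.2 or later; the frozenset() spelling is, as noted, not optimized):

```python
def in_set_literal(ob):
    # In CPython 3.2+, the compiler replaces this set literal with a
    # cached frozenset constant, so no set is rebuilt per call.
    return ob in {'red', 'green', 'blue'}


def in_frozenset_call(ob):
    # This spelling is NOT optimized: frozenset() is an ordinary
    # function call that rebuilds the set on every evaluation.
    return ob in frozenset({'red', 'green', 'blue'})
```

Both give the same answers; only the first avoids per-call set creation, which is what made 'ob in constant_set' competitive with 'ob in constant_tuple'.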

In your example
  "one:%(one)s two:%(two)s" % {"one": "is the loneliest number",
  "two":"can be as bad as one"}
can be instead put into the code as
   'one:is the loneliest number two:can be as bad as one'
which is easier to write and read. So there is little need, if any, to write the expression. Hence little benefit, and the cost-benefit decision for most will be 'no'.
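Worked out, the manual folding looks like this (a sketch; only the equivalence of the two forms is shown):

```python
# Evaluated form: a new dict is built every time this runs.
def formatted_each_time():
    return "one:%(one)s two:%(two)s" % {"one": "is the loneliest number",
                                        "two": "can be as bad as one"}


# Folded by hand: the programmer simply writes the final string.
FOLDED = "one:is the loneliest number two:can be as bad as one"
```

Since the two are equivalent, the expression form buys nothing here that the plain literal does not.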

The other thing you are missing is that the peephole optimizer is tricky code and easily broken when changed. The folding of '-' and int literals somehow got disabled some time ago and was only recently fixed. Even worse are changes that produce bugs, as seems to happen regularly with more aggressive optimizers. Attempts to compile CPython with gcc with all 'optimizations' turned on have demonstrated that well enough. The prime author of Python's bytecode peephole optimizer, Raymond H., would like to replace some or all of it with an ast optimizer rather than expand it.
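The '-' folding mentioned above can be checked the same way as any other constant fold (a sketch; whether it holds depends on which CPython version you run):

```python
def neg():
    # When the peephole folding is working, the unary minus is
    # applied at compile time and -5 is stored as a single constant.
    return -5


# In a CPython where the fold is active, -5 shows up in co_consts.
print(-5 in neg.__code__.co_consts)
```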

Perhaps if and when that is done, someone will devise some broader but guaranteed safe detect and replace rules.

--
Terry Jan Reedy

--
http://mail.python.org/mailman/listinfo/python-list
