On 4/28/07, Jeffrey Yasskin <[EMAIL PROTECTED]> wrote:
> On 4/25/07, Jim Jewett <[EMAIL PROTECTED]> wrote:
> > On 4/25/07, Jeffrey Yasskin <[EMAIL PROTECTED]> wrote:
> > > class MonoidUnderPlus(Abstract):
> >
> > Is this useful?  Just because two things are both Monoid instances
> > doesn't mean I can add them -- they have to be part of the same
> > Monoid.  By the time you do
> >
> >     assert isinstance(a, MonoidUnderPlus)
> >     assert isinstance(b, MonoidUnderPlus)
> >     assert isinstance(a, b.__class__) or isinstance(b, a.__class__)
> >
> > I'm not sure how much you've really saved.
>
> I brought this up on the PEP 3119 thread, as an issue for
> TotallyOrdered too. The exact syntax for checking that two objects are
> "in the same instance of ABC X" should be decided there.
>
> As to the benefits of MonoidUnderPlus over just hasattr(a, '__plus__'):
> by knowing that the operation is associative, a function may be able to
> speed itself up. A somewhat academic example is a SortedList type:
>
>     class SortedList(MonoidUnderPlus):
>         def __init__(self, iterable):
>             self.contents = sorted(iterable)
>         def __add__(lhs, rhs):
>             return merge(lhs, rhs)
>
> Then (ignoring that sum currently requires numbers) we can define
> mysorted() as:
>
>     def mysorted(iterable):
>         return sum(map(SortedList, iterable)).contents
>
> With sum() implemented as a simple for loop, this is insertion sort,
> taking O(n^2) time. However, by noticing that + is associative on its
> arguments, sum can be defined (assume the annotation implies the
> appropriate checking) roughly as:
>
>     def sum(lst : SequenceOfMonoidUnderPlus):
>         if len(lst) == 1:
>             return lst[0]
>         else:
>             return sum(lst[:len(lst)//2]) + sum(lst[len(lst)//2:])
>
> and we've transformed the algorithm into merge sort, taking O(n log n)
> time. (Hm, the zero element isn't as useful here as in other languages
> since you can't get a type without an instance. Maybe it should leave
> the interface.)
>
> This optimization is, of course, incorrect if + is not associative.
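For concreteness, here is a runnable sketch of the SortedList example
above, with the gaps filled in by assumption: MonoidUnderPlus does not
exist, so SortedList is a plain class; merge() is taken to be a merge of
the two sorted contents lists (done here with heapq.merge); and sum()'s
objection to non-numbers is sidestepped by passing an empty SortedList as
the start value. With today's left-to-right sum() this is the O(n**2),
insertion-sort-like behaviour described in the quoted message.

    import heapq

    class SortedList:
        def __init__(self, iterable):
            self.contents = sorted(iterable)

        def __add__(self, other):
            # Merge two already-sorted lists in O(len(self) + len(other)).
            result = SortedList(())
            result.contents = list(heapq.merge(self.contents, other.contents))
            return result

    def mysorted(iterable):
        # Each element becomes a singleton SortedList; sum() then folds
        # them together left to right.  Each step merges an accumulator of
        # size k with a singleton, so the whole thing is O(n**2).
        return sum((SortedList([x]) for x in iterable), SortedList(())).contents

    print(mysorted([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]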
Unfortunately, such a dramatic performance difference will mean that
people come to depend on it, and are then surprised when it occasionally
fails. Since the original O(n**2) complexity would now be perceived as a
failure mode, it's best to split the optimized path out into an explicit
method. Python does have many optimizations, but they're well hidden, and
rarely if ever result in a noticeable complexity difference.

-- 
Adam Olsen, aka Rhamphoryncus
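One way to read that suggestion is to keep the plain left-to-right fold as
is and give the divide-and-conquer reduction its own name, so callers opt
into the associativity requirement explicitly. The sketch below is only an
illustration of that idea; the names fold() and balanced_fold() are made
up, not anything proposed in the thread.

    import operator
    from functools import reduce

    def fold(op, items, start):
        # Plain left fold: no associativity assumption, O(n) applications
        # of op, each over a growing accumulator.
        return reduce(op, items, start)

    def balanced_fold(op, items):
        # Balanced reduction: only valid if op is associative.  Applied to
        # the singleton SortedLists from the earlier sketch, this turns
        # mysorted() into an O(n log n) merge sort.
        items = list(items)
        if not items:
            raise ValueError("balanced_fold() of empty sequence")
        if len(items) == 1:
            return items[0]
        mid = len(items) // 2
        return op(balanced_fold(op, items[:mid]), balanced_fold(op, items[mid:]))

    print(fold(operator.add, [1, 2, 3, 4], 0))        # 10
    print(balanced_fold(operator.add, [1, 2, 3, 4]))  # 10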
