Correcting myself:
Dimitrios Apostolou wrote:
> Hi,
>
> I just dug into the source code looking for complexity of set
> operations. In the wiki page I documented an interesting finding, that
> it is different to do s-t and s.difference(t). It is also interesting
it is different to do s-t than s.difference(t).
Hi,
I just dug into the source code looking for complexity of set
operations. In the wiki page I documented an interesting finding, that
it is different to do s-t and s.difference(t). It is also interesting
that you can do the first only for sets, but the second for every
iterable in t.
Are t
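For reference, a minimal sketch of the distinction (a plain CPython session):

    s = {1, 2, 3, 4}
    t = [3, 4, 5]            # a list, not a set

    print(s.difference(t))   # {1, 2} -- the method accepts any iterable
    print(s - set(t))        # {1, 2} -- the operator wants sets on both sides
    # print(s - t)           # TypeError: unsupported operand type(s)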
On Thu, Mar 13, 2008 at 3:16 AM, Dimitrios Apostolou <[EMAIL PROTECTED]> wrote:
> On another note which sorting algorithm is python using? Perhaps we can
> add this as a footnote. I always thought it was quicksort, with a worst
> case of O(n^2).
It's a highly optimized variant of mergesort, with a worst case of O(n log n).
Dimitrios Apostolou <[EMAIL PROTECTED]> wrote:
> On another note which sorting algorithm is python using? Perhaps we can
> add this as a footnote. I always thought it was quicksort, with a worst
> case of O(n^2).
See http://svn.python.org/projects/python/trunk/Objects/listsort.txt
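One user-visible consequence of the mergesort family, worth remembering
alongside the O(n log n) bound, is stability (guaranteed since 2.3):

    # Equal keys keep their original relative order:
    pairs = [("b", 2), ("a", 1), ("b", 1), ("a", 2)]
    pairs.sort(key=lambda p: p[0])
    print(pairs)   # [('a', 1), ('a', 2), ('b', 2), ('b', 1)]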
Daniel Stutzbach wrote:
> On Wed, Mar 12, 2008 at 2:52 PM, Dimitrios Apostolou <[EMAIL PROTECTED]>
> wrote:
>> Just one quick note. What exactly do you mean by "Amortized worst case"?
>> Shouldn't it just be "Worst case"? I think that the word "amortized"
>> better describes the time complexity of specific operations.
On Wed, Mar 12, 2008 at 2:52 PM, Dimitrios Apostolou <[EMAIL PROTECTED]> wrote:
> Just one quick note. What exactly do you mean by "Amortized worst case"?
> Shouldn't it just be "Worst case"? I think that the word "amortized"
> better describes the time complexity of specific operations.
http:/
Daniel Stutzbach wrote:
> I just created a very basic one at
> http://wiki.python.org/moin/TimeComplexity?action=show
Hi,
Just one quick note. What exactly do you mean by "Amortized worst case"?
Shouldn't it just be "Worst case"? I think that the word "amortized"
better describes the time complexity of specific operations.
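The amortized part is easy to observe for list.append; a rough sketch
(the printed sizes are CPython implementation details and vary by version):

    import sys

    # CPython over-allocates the array behind a list, so most appends are
    # O(1) and only the occasional resize is O(n); averaged over a long
    # run of appends, that is amortized O(1) per call.
    lst = []
    last = sys.getsizeof(lst)
    for i in range(64):
        lst.append(i)
        size = sys.getsizeof(lst)
        if size != last:
            print("resize at length %d: %d -> %d bytes" % (len(lst), last, size))
            last = size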
On Mon, Mar 10, 2008 at 12:05 PM, Daniel Stutzbach
<[EMAIL PROTECTED]> wrote:
> On Sun, Mar 9, 2008 at 9:22 AM, Aahz <[EMAIL PROTECTED]> wrote:
> > There probably would be some value in a wiki page on python.org that
> > provides this information, particularly across versions. You may be
> > able to find volunteers to help on comp.lang.python.
>> I follow the advice Guido gave: I use the data
>> structure that will make my code shortest and easiest
>> to read,
>
> But given a choice such as list vs. dict, the code
> is almost identical either way. What do you do then?
Why do you say that? Lists and dictionaries are *completely*
different data structures.
Martin v. Löwis wrote:
> I follow the advice Guido gave: I use the data
> structure that will make my code shortest and easiest
> to read,
But given a choice such as list vs. dict, the code
is almost identical either way. What do you do then?
Toss a coin? Or make a few educated guesses based on
what you know about how they are implemented?
On Wed, Mar 12, 2008, Stephen J. Turnbull wrote:
> "Martin v. L?wis" writes:
>>
>> Premature optimization is the root of all evil.
>
> Actually, Knuth's bon mot was "Premature optimization is the root of
> all error."
From my .sig database:
"Premature optimization is the root of all evil in programming."
"Martin v. Löwis" writes:
> Premature optimization is the root of all evil.
Actually, Knuth's bon mot was "Premature optimization is the root of
all error."
Which is probably worse; it leaves the perpetrator the excuse "but I
was only trying to help!" While we all know what to do to evildoers,
> Can you really say that you don't make any design
> decisions early on based on the assumption that
> dict lookup will almost certainly be a lot faster
> than searching a list?
I follow the advice Guido gave: I use the data
structure that will make my code shortest and easiest
to read, regardless of performance.
Daniel Stutzbach wrote:
> On Sun, Mar 9, 2008 at 9:22 AM, Aahz <[EMAIL PROTECTED]> wrote:
>> There probably would be some value in a wiki page on python.org that
>> provides this information, particularly across versions. You may be
>> able to find volunteers to help on comp.lang.python.
>
> I just created a very basic one at
> http://wiki.python.org/moin/TimeComplexity?action=show
"Greg Ewing" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]
| Yeah, there's no substitute for having at least a
| rough idea of how the object you're using is implemented,
| e.g. array vs. linked list.
As I understand it, the new 2.6 docs include a new one on CPython
specifically.
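A concrete case where that rough idea pays off is list versus
collections.deque for queue-like use; a sketch:

    from collections import deque

    lst = list(range(100000))
    dq = deque(lst)

    # list is an array: pop(0) shifts every remaining element, O(n).
    # deque is a linked structure of blocks: popleft() is O(1).
    print(lst.pop(0))
    print(dq.popleft())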
Dimitrios Apostolou wrote:
> On the other hand notes could be added for various oddities according to
> experience. For example that for sets up to 10(?) elements, a list would
> probably be better because of the hashing overhead.
That's the sort of thing that tends to be *very*
implementation-dependent.
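The only portable advice is to measure on the build you care about;
a sketch of such a measurement:

    import timeit

    # Whether a short linear scan beats the hashing overhead for tiny
    # collections -- and at what size -- is the implementation-dependent part.
    setup = "small_list = [1, 2, 3, 4, 5]; small_set = set(small_list)"
    print(timeit.timeit("5 in small_list", setup=setup))
    print(timeit.timeit("5 in small_set", setup=setup))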
Martin v. Löwis wrote:
> I would still debate the complexity of dict lookup. What are you
> averaging over?
All the Python programs I've ever run. :-)
> I also disagree that the average complexity is "most important".
> I find the worst-case complexity most important.
While in theory the worst case matters, in practice it is the
average case that dominates the running time you actually see.
Martin v. Löwis wrote:
> Yes - the assumption is that more del calls will follow, so that the
> dictionary eventually ends up empty. Only when new inserts are made,
> that assumption is proven wrong, and the shrinking can be done in
> one sweep.
Hmmm. So the most efficient way to copy a dict that
has seen many deletions is to rebuild it from its items?
> I assume there is a reason that PyDict_DelItem never calls dictresize?
Yes - the assumption is that more del calls will follow, so that the
dictionary eventually ends up empty. Only when new inserts are made,
that assumption is proven wrong, and the shrinking can be done in
one sweep.
Regards,
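The consequence is visible from pure Python (exact sizes are
version-specific implementation details):

    import sys

    d = dict.fromkeys(range(100000))
    big = sys.getsizeof(d)
    for k in range(100000):
        del d[k]
    # The dict is empty, but del alone never gave back the table:
    print(len(d), big, sys.getsizeof(d))
    print(sys.getsizeof({}))   # a fresh empty dict, for comparison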
On Mon, Mar 10, 2008 at 5:06 PM, "Martin v. Löwis" <[EMAIL PROTECTED]> wrote:
> > I just created a very basic one at
> > http://wiki.python.org/moin/TimeComplexity?action=show
>
> I just knew that this could be a field of endless nitpicking.
Certainly. I am hoping that this thread will soon win
> I just created a very basic one at
> http://wiki.python.org/moin/TimeComplexity?action=show
I just knew that this could be a field of endless nitpicking.
I think the dict.copy classification is incorrect. A dict.copy
can take unbounded time, if the dict has seen many recent
deletions (which did not shrink the hash table).
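A sketch of the effect, assuming current CPython behaviour: two dicts
of the same length but very different copy times, because one of them
used to be big:

    import timeit

    hollow = dict.fromkeys(range(10**6))
    for k in range(10**6 - 10):
        del hollow[k]                 # len(hollow) == 10, table still huge
    fresh = dict.fromkeys(range(10))  # also 10 keys, small table

    # copy() walks the whole table, not just the live entries:
    print(timeit.timeit(hollow.copy, number=100))
    print(timeit.timeit(fresh.copy, number=100))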
On Mon, Mar 10, 2008, Daniel Stutzbach wrote:
> On Sun, Mar 9, 2008 at 9:22 AM, Aahz <[EMAIL PROTECTED]> wrote:
>>
>> There probably would be some value in a wiki page on python.org that
>> provides this information, particularly across versions. You may be
>> able to find volunteers to help on comp.lang.python.
On Sun, Mar 9, 2008 at 9:22 AM, Aahz <[EMAIL PROTECTED]> wrote:
> There probably would be some value in a wiki page on python.org that
> provides this information, particularly across versions. You may be
> able to find volunteers to help on comp.lang.python.
I just created a very basic one at
http://wiki.python.org/moin/TimeComplexity?action=show
> I will open this can and say that average case complexity is the most
> important. For example sorting O(n log n) and dict lookup O(1).
I would still debate the complexity of dict lookup. What are you
averaging over? In the absence of any distribution property of hash
functions in the average case,
Fred Drake wrote:
> On Mar 9, 2008, at 10:22 AM, Aahz wrote:
>> This has been discussed before and rejected for two reasons:
>>
>> * Other Python implementations (Jython, PyPy, IronPython) may not be
>> able to provide the same type implementations
>>
>> * Algorithmic information does sometimes change between versions, and
>> keeping the docs updated is not trivial
Hello again,
Guido van Rossum wrote:
> Well, there you have hit the nail on the head -- should we document
> the actual or the guaranteed O() expression? I think this is a can of
> worms better left unopened. At best we should include some hints to
I will open this can and say that average case complexity is the most
important. For example sorting O(n log n) and dict lookup O(1).
> Dict access should probably be documented as no worse
> than O(log n) to allow for tree implementations.
That should not be documented. The current dict implementation
may use O(n) for lookup operations, where n is the number of
keys in the dictionary, and counting comparison operations.
Regards,
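That O(n) case is easy to provoke deliberately; a sketch with a
pathological __hash__, purely for illustration:

    import timeit

    class BadKey(object):
        # All instances hash alike, so every key lands in the same slot.
        def __init__(self, v):
            self.v = v
        def __hash__(self):
            return 42
        def __eq__(self, other):
            return self.v == other.v

    d = dict((BadKey(i), i) for i in range(1000))
    probe = BadKey(999)

    # Each lookup probes through the colliding entries: O(n), not O(1).
    print(timeit.timeit(lambda: d[probe], number=100))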
On Sun, Mar 9, 2008 at 4:43 PM, Guido van Rossum <[EMAIL PROTECTED]> wrote:
> Well, there you have hit the nail on the head -- should we document
> the actual or the guaranteed O() expression?
Either. Both. Put a note at the bottom saying that factors of O(log
n) have been dropped and they're effectively constant for any
realistic input size.
On Sun, Mar 9, 2008 at 11:50 AM, Greg Ewing <[EMAIL PROTECTED]> wrote:
> Aahz wrote:
> > * Other Python implementations (Jython, PyPy, IronPython) may not be
> > able to provide the same type implementations
> >
> > * Algorithmic information does sometimes change between versions, and
> > keeping the docs updated is not trivial
Aahz wrote:
> * Other Python implementations (Jython, PyPy, IronPython) may not be
> able to provide the same type implementations
>
> * Algorithmic information does sometimes change between versions, and
> keeping the docs updated is not trivial
Still, I think there are some general expectations
On Mar 9, 2008, at 10:22 AM, Aahz wrote:
> This has been discussed before and rejected for two reasons:
>
> * Other Python implementations (Jython, PyPy, IronPython) may not be
> able to provide the same type implementations
>
> * Algorithmic information does sometimes change between versions, and
> keeping the docs updated is not trivial
On Sun, Mar 09, 2008, Dimitrios Apostolou wrote:
>
> Is it possible to include algorithm complexity information for the various
> built-in types (strings, sets, lists, dictionaries...)? This would ease
> the decision for choosing the correct type.
This has been discussed before and rejected for two reasons:

* Other Python implementations (Jython, PyPy, IronPython) may not be
able to provide the same type implementations

* Algorithmic information does sometimes change between versions, and
keeping the docs updated is not trivial
Hello all,
Is it possible to include algorithm complexity information for the various
built-in types (strings, sets, lists, dictionaries...)? This would ease
the decision for choosing the correct type. The information could simply
be added as a new column in the tables found on pages such as the following: