On Friday, 4 December 2015 at 07:50:19 UTC, Minas Mina wrote:
On Friday, 4 December 2015 at 03:46:39 UTC, Walter Bright wrote:
On 12/3/2015 5:27 PM, Andrei Alexandrescu wrote:
Now this primitive may have three complexities:

* linear in the length of r (e.g. c is a singly-linked list)

* linear in the number of elements after r in the collection (e.g. c is an array)

* constant (c is a doubly-linked list)

These complexities must be reflected in the name of the primitives. Or perhaps it's okay to conflate a couple of them. I know names are something we're awfully good at discussing :o). Destroy!

Are you sure there are only 3 complexities? What if it's a self-balancing tree?

I'm also not sure it is a good idea to put the complexity in the name of the primitive. Do any other algorithm libraries do that?

I agree -- putting the complexity (eh, running time) in the primitive's name seems like a bad idea.

I believe that stability is a more important "feature" of a sorting algorithm than its running time, because you can rely on its guarantees in your own algorithms (I use it when implementing delta compression), and it affects the output.

Running time is still a good guarantee to have, but whether `sort` runs in O(n^2) or O(n log n) is not going to affect what your output looks like!

That's wrong. Imagine a sort method that calls out to an API server to sort the numbers. Given a good internet connection, this implementation detail will not affect the correctness of the result. But I'm sure no sane person would want to use such a method. When the only job of a function is to implement an algorithm, then the only thing you should care about is its time and space complexity (the correctness is a given). Otherwise, designing large systems becomes impossible, because all large systems have hard performance requirements.
