On 10/10/2010 02:05, Jonathan M Davis wrote:
> On Saturday 09 October 2010 06:54:31 Stewart Gordon wrote:
<snip>
>> Surely, what matters most for generic code is that it has consistent
>> _semantics_, rather than the computational complexity?
>> Stewart.
> _Both_ matter.
True, yet you don't address semantic differences at all in what follows,
so it doesn't really follow on from what I said. But anyway....
> Imagine, for instance, if we were to make Java's mistake of having List be an
> interface which ArrayList and LinkedList implement and that List has a get()
> function. Using ArrayList, get() has a complexity of O(1). Using LinkedList, it
> has a complexity of O(n).
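To make that cost concrete, here is a minimal Java sketch (the class and method names are mine, purely illustrative): an index-based loop over a List makes n calls to get(), so it runs in O(n) on ArrayList but O(n^2) on LinkedList.

```java
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

public class GetCost {
    // n calls to get(i): O(n) total on ArrayList (each get is O(1)),
    // but O(n^2) total on LinkedList (get(i) walks i nodes from the head).
    static long sumByIndex(List<Integer> list) {
        long total = 0;
        for (int i = 0; i < list.size(); i++) {
            total += list.get(i);
        }
        return total;
    }

    public static void main(String[] args) {
        List<Integer> array = new ArrayList<>();
        List<Integer> linked = new LinkedList<>();
        for (int i = 0; i < 10_000; i++) {
            array.add(i);
            linked.add(i);
        }
        // Identical results; only the running time differs.
        System.out.println(sumByIndex(array));   // 49995000
        System.out.println(sumByIndex(linked));  // 49995000, but quadratically slower
    }
}
```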
Are you suggesting we just don't have a List interface, just to prevent
people from writing algorithms that work well in some list
implementations but poorly in others? I'm not sure if that risk
outweighs any benefits....
> Now, imagine code written with List rather than
> ArrayList or LinkedList directly. If you use get(), then your code will vary
> considerably in how efficient it is. With small lists, this won't matter, but
> with large lists, it could matter a lot. Your generic code which uses List _has_
> to worry about the computational complexity of get() or it could become very
> inefficient. In this case, for most situations, that would mean using a foreach
> loop or iterators directly; using get() is likely a _bad_ idea. You need to
> know the computational complexity of get() in order to use List efficiently.
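For comparison, a sketch of the iterator-based alternative recommended above (again, names are mine): foreach over a List goes through its Iterator, which advances one element at a time, so traversal is O(n) on ArrayList and LinkedList alike.

```java
import java.util.Arrays;
import java.util.LinkedList;
import java.util.List;

public class IterateCost {
    // foreach compiles down to list.iterator(); each next() is O(1) on both
    // ArrayList and LinkedList, so the whole traversal is O(n) either way.
    static long sum(List<Integer> list) {
        long total = 0;
        for (int x : list) {
            total += x;
        }
        return total;
    }

    public static void main(String[] args) {
        List<Integer> linked = new LinkedList<>(Arrays.asList(1, 2, 3, 4));
        System.out.println(sum(linked)); // 10
    }
}
```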
Indeed. But what is an efficient algorithm depends on how the data
structure works in the first place, not just the computational
complexity of particular operations. It is perhaps for this reason that
it is foolish to try to write such generic algorithms.
An example of this is Collections.sort(List) in Java, which gets around the
problem by dumping the list into an array, an operation which itself adds
overhead. Better would be to have sort as a method of List, so that each
list class can provide an efficient implementation of sort.
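As a sketch of that suggestion (hypothetical types of my own, not Java's actual API): a list class that owns its sort() can pick a strategy suited to its layout, e.g. a merge sort that relinks the existing nodes in place and needs no auxiliary array at all.

```java
public class SortableLinkedList {
    // Hypothetical sketch: a list type providing its own sort(), instead of
    // relying on a generic copy-to-array sort like Collections.sort.
    static final class Node {
        int value;
        Node next;
        Node(int value, Node next) { this.value = value; this.next = next; }
    }

    Node head;

    void push(int v) { head = new Node(v, head); }

    // Merge sort relinks existing nodes: O(n log n) time, no temporary array.
    void sort() { head = mergeSort(head); }

    private static Node mergeSort(Node n) {
        if (n == null || n.next == null) return n;
        // Find the midpoint with slow/fast pointers, then split in two.
        Node slow = n, fast = n.next;
        while (fast != null && fast.next != null) {
            slow = slow.next;
            fast = fast.next.next;
        }
        Node second = slow.next;
        slow.next = null;
        return merge(mergeSort(n), mergeSort(second));
    }

    private static Node merge(Node a, Node b) {
        Node dummy = new Node(0, null), tail = dummy;
        while (a != null && b != null) {
            if (a.value <= b.value) { tail.next = a; a = a.next; }
            else                    { tail.next = b; b = b.next; }
            tail = tail.next;
        }
        tail.next = (a != null) ? a : b;
        return dummy.next;
    }

    public static void main(String[] args) {
        SortableLinkedList list = new SortableLinkedList();
        for (int v : new int[] {3, 1, 4, 1, 5}) list.push(v);
        list.sort();
        StringBuilder out = new StringBuilder();
        for (Node n = list.head; n != null; n = n.next) out.append(n.value).append(' ');
        System.out.println(out.toString().trim()); // 1 1 3 4 5
    }
}
```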
> To write efficient algorithms, you _must_ be able to rely on certain complexity
> guarantees for the operations that you use. So, regardless of what those
> complexity guarantees actually _are_, they need to be there.
Setting the complexity guarantee of everything to O(n!) would achieve
_that_ aim in most everyday cases.... :)
Stewart.