On 10/7/2010 14:33, Steven Schveighoffer wrote:
> On Thu, 07 Oct 2010 16:23:47 -0400, Rainer Deyke <rain...@eldwood.com>
> wrote:
>> I can't say I've ever cared about the big-O complexity of an operation.
>
> Then you don't understand how important it is.
Let me rephrase that.  I care about performance.  Big-O complexity can
obviously have a significant effect on performance, so I do care about it,
but only to the extent that it affects performance.  Low big-O complexity
is a means to an end, not a goal in and of itself.  If 'n' is low enough,
then an O(2**n) algorithm may well be faster than an O(1) algorithm.

I also believe that, in the absence of a sophisticated system that actually
verifies performance guarantees, the language and standard library should
trust the programmer to know what he is doing.  The standard library should
only provide transitive performance guarantees, e.g. this algorithm calls
function 'f' 'n' times, so the algorithm's performance is
O(n * complexity(f)).  If 'f' runs in constant time, the algorithm runs in
linear time.  If 'f' runs in exponential time, the algorithm still runs.

> big O complexity is very important when you are writing libraries.  Not
> so much when you are writing applications -- if you can live with it in
> your application, then fine.  But Phobos should not have these problems
> for people who *do* care.
>
> What I'd suggest is to write your own function that uses in when
> possible and find when not possible.  Then use that in your code.

The issue is that algorithms may use 'in' internally, so I may have to
rewrite large parts of Phobos.  (And the issue isn't about 'in'
specifically, but complexity guarantees in general.)

-- 
Rainer Deyke - rain...@eldwood.com
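[A minimal sketch of the kind of wrapper Steven suggests, not taken from the
thread: use the fast 'in' lookup when the container supports it, and fall
back to a linear search via Phobos' std.algorithm.find otherwise.  The name
'contains' and the exact signature are invented for illustration.]

import std.algorithm : find;
import std.range : empty;

// Hypothetical helper: prefers the O(1)-expected 'in' operator when the
// container supports it (e.g. built-in associative arrays), otherwise
// falls back to an O(n) linear search over the range.
bool contains(C, K)(C container, K key)
{
    static if (is(typeof(key in container)))
        return (key in container) !is null;   // e.g. associative arrays
    else
        return !find(container, key).empty;   // generic ranges
}

unittest
{
    int[string] table = ["a": 1, "b": 2];
    assert(contains(table, "a"));       // resolved via 'in'
    assert(!contains([1, 2, 3], 5));    // resolved via find
}

[The static if branch is chosen at compile time, so the caller pays the
linear cost only for containers that have no constant-time lookup.]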