An entity claiming to be Greg McCarroll ([EMAIL PROTECTED]) wrote:
: 
: ok, but it gets more interesting as you take into account Moore's law,
: which reduces the effectiveness of optimisation by halving the improvement
: of the optimisation every year; this of course then compares to programmers
: 

I'm not sure this is necessarily the case.  First, complexity is
independent of platform.  A sort routine with a worst case of O(n log n)
still has a worst case of O(n log n) on a faster platform.  Complexity
measures an abstract, arbitrary cost, which could be the number of
comparisons, the number of assignments, or both, etc.  It does not measure
true running time (or, necessarily, space); by speeding up the platform,
you have not optimised the algorithm.  The important thing to remember
about computational complexity is that we are dealing with asymptotic
behavior: the true impact of design decisions hits us as n becomes very
large.  By speeding up the platform, we merely raise the bar for what we
consider "very large".

The danger of relying on Moore's law to overcome computational
intractability is that we fail to account for the fact that the inputs are
growing at an accelerating rate, too.  I'm not sure it is fair to say that
average datasets grow exponentially, but I think it is safe to say that as
file sizes and available network bandwidth grow, the bar we've raised with
platform optimisation is not as far away as we might have thought.
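
A little arithmetic shows how quickly that bar approaches.  A rough sketch
(the doubling rates here are assumptions for the sake of the example, not
measurements):

    # Suppose the platform doubles in speed each cycle (Moore's law),
    # but the input size doubles as well.  For an O(n^2) algorithm the
    # work quadruples per cycle while the speed only doubles, so the
    # wall-clock time still doubles every cycle.
    n, speed = 1_000, 1.0
    for cycle in range(5):
        work = n * n       # abstract operation count
        print(f"cycle {cycle}: n = {n:>6}  relative time = {work / speed:,.0f}")
        n *= 2             # inputs grow with storage and bandwidth
        speed *= 2         # hardware gets faster

The faster hardware never catches up; it just loses ground more slowly
than it otherwise would have.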

Mark

-- 
Mark Rogaski                  | "I've said this before but I'll say it again:
[EMAIL PROTECTED]             |     Smashing Pumpkins IS REO Speedwagon."
http://www.pobox.com/~wendigo |                      -- Steve Albini
__END__                       |

