On Wed, 10 Jan 2001, Mark Rogaski wrote:

>The danger of relying on Moore's law to overcome computational
>intractability is that we fail to account for the fact that the inputs are
>increasing at an accelerated rate, too.  I'm not sure if it is fair to say
>that average datasets increase exponentially, but I think it is safe to say
>that as file sizes and available networking bandwidth grow, the bar that
>we've raised with platform optimisation is not as far away as we might have
>thought.

Computational intractability is one thing, but overall running time is
also a factor. Consider a daily job that takes 25 hours to run: not much
use at the moment, but also not worth optimising if you can hold off for
6 months until it runs within 24 hours on the kit that's then within your
budget. Once it runs within the required time window, you're fine,
irrespective of the actual algorithmic complexity. Daft example, I know,
but it does put across the other half of the optimisation trade-off.
I came across a great little saying about number crunching research:
 If you have a grant for a three year project, then it's more efficient
to spend the first 18 months on holiday in Hawaii, and then come back
and buy the kit to do the crunching.
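For fun, the arithmetic behind the saying (again my own sketch under the
same 18-month-doubling assumption; the function is hypothetical):

```python
def total_months(job_months_on_today_kit, wait_months, doubling_months=18):
    """Total wall-clock time: wait, then run the job on kit that has
    sped up by a factor of 2**(wait/doubling) in the meantime."""
    speedup = 2 ** (wait_months / doubling_months)
    return wait_months + job_months_on_today_kit / speedup

# A 36-month crunch started on day one vs. after 18 months in Hawaii:
print(total_months(36, 0))   # 36.0
print(total_months(36, 18))  # 36.0 -- the holiday costs nothing
```

On these numbers the holiday exactly breaks even; a shorter wait would
actually finish sooner than either, but that rather spoils the joke.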

Not particularly profound, but a fun thought nonetheless. And of course
there's the old saw that the cheapest time to buy a computer is tomorrow.

Mike Wyer <[EMAIL PROTECTED]>     ||         "Woof?"
http://www.doc.ic.ac.uk/~mw     ||  Gaspode the Wonder Dog
Work:  +44 020 7594 8440        ||
Mobile: +44 07879 697119        ||  ICQ: 43922064
