On Mon, Mar 18, 2013 at 4:31 PM, OWP <owpmail...@gmail.com> wrote:

> If I may ask, I'm not quite sure what O(2^n) and O(1) are?

Just a metaphor using algorithmic complexity, is all.

> I'm curious, were not all these built on the foundation of Moore's
> Law? Everything Vigoda lists has Moore's Law in mind. If Moore's Law
> were to suddenly disappear, could these survive on their own merit?

No one really knows, since Moore's Law has operated for so long. In my
opinion there are still constant-factor gains to be had*, but in some
problems/areas/domains ASICs don't seem to help very much**, and we should
not forget the colossal investment that a cutting-edge x86 chip fab
represents***, which may make a commodity processor the best general bang
for the buck. For example, for every person who trumpets a 100x gain from
switching their program to a GPU, there's another person abandoning the
effort because they lose all the speed gains transferring data back and
forth to the GPU.

But I think this is getting pretty off-topic for Haskell-Cafe.

* I'm not an expert, but some useful material is in
http://www.gwern.net/Aria%27s%20past,%20present,%20and%20future#fn3 and
http://www.gwern.net/Self-decrypting%20files#constant-factors

** The more serial a problem is, and the more conditionals, memory
accesses, and distinct operations a task requires, the more the optimal
processor will... look like a CPU.

*** http://www.gwern.net/Slowing%20Moore%27s%20Law#fab-costs-and-requirements

--
gwern

_______________________________________________
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe
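[Postscript, not part of the original mail: the GPU point above can be made concrete with a back-of-the-envelope Amdahl-style calculation. The Haskell sketch below uses invented timings (1 s of CPU work, a 100x kernel speedup, 0.9 s of host/device transfer) purely for illustration; `effectiveSpeedup` is a hypothetical helper, not from any library.]

```haskell
-- Sketch of how data-transfer overhead can eat a GPU kernel speedup.
-- All timings are made up for illustration.

-- | Effective end-to-end speedup once transfer time is counted:
-- cpuTime / (cpuTime / kernelSpeedup + transferTime)
effectiveSpeedup :: Double -> Double -> Double -> Double
effectiveSpeedup cpuTime kernelSpeedup transferTime =
  cpuTime / (cpuTime / kernelSpeedup + transferTime)

main :: IO ()
main = do
  -- 1 s of CPU work, a 100x faster kernel, but 0.9 s moving data:
  print (effectiveSpeedup 1.0 100 0.9)  -- roughly 1.1x, not 100x
  -- with negligible transfer, the headline number returns:
  print (effectiveSpeedup 1.0 100 0.0)  -- 100.0
```

The asymmetry is the point: the kernel speedup divides only the compute term, while the transfer term is untouched, so once transfer dominates, no kernel speedup can rescue the total.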