On Tue, May 7, 2013 at 8:30 PM, Tim Tyler <[email protected]> wrote:
>
> On 06/05/2013 11:09, Matt Mahoney wrote:
>> By Moore's Law, the hardware loses half its value over 2 years, which
>> makes its useful lifetime 4 years. If the initial cost is $10 million,
>> then figure $200K per month. In 2 years it will be $100K per month.
>> There is also the cost of electricity, currently $100 per hour ($72K
>> per month). This will also drop with Moore's Law, but rise as energy
>> becomes more scarce.
>
> What, you mean: as the universal heat death gradually approaches?!?
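The quoted figures follow from simple amortization. A minimal sketch, using only the numbers in the message above and assuming a 720-hour month (24 x 30, which matches the $100/hour to $72K/month conversion); the $200K/month figure is a round-off of $10M / 48 months:

```python
# Cost arithmetic from the quoted message. All dollar figures are the
# message's own assumptions, not measurements.

HOURS_PER_MONTH = 24 * 30  # 720; matches $100/hr -> $72,000/month

def monthly_hardware_cost(initial_cost, useful_years=4):
    """Amortize the purchase price evenly over the useful lifetime."""
    return initial_cost / (useful_years * 12)

def monthly_electricity_cost(dollars_per_hour):
    """Electricity cost per month at a constant hourly rate."""
    return dollars_per_hour * HOURS_PER_MONTH

hw = monthly_hardware_cost(10_000_000)   # ~$208,333/month (~$200K)
power = monthly_electricity_cost(100)    # $72,000/month
print(round(hw), round(power))
```

The "2 years from now it will be $100K per month" line is just the same amortization applied after one Moore's Law halving of the hardware's value.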
Energy costs will rise as we run out of oil, coal, gas, and uranium and have to switch to solar. They will also rise as population grows and as the economy grows, especially in underdeveloped countries. Moore's Law should keep dropping the power requirement for a while, but at this point we do not seem to have a path to a 1 petaflop computer that runs on 100 watts. Just shrinking transistors won't do it, because feature sizes are already down to about 100 atoms across.

-- Matt Mahoney, [email protected]

-------------------------------------------
AGI Archives: https://www.listbox.com/member/archive/303/=now
