I'm working on a revision of my paper on the cost of AI.
https://docs.google.com/document/d/1Z0kr3XDoM6cr5TgHH0GXQTjyikr7WpCkpWFn9IglW3o/edit?usp=sharing

Some highlights of the changes so far.

1. Power requirements. A human-brain-sized neural network running at
10^16 OPS on a supercomputer typically uses 10 MW of power, at a cost
of $1000 per hour for electricity. This is 4 x 10^5 times more than
the human brain, a gap unlikely to be closed by shrinking chip
feature sizes, which are already only about 100 silicon atoms wide.
Running 10^10 such computers to automate the economy, if it were
possible, would produce 10^17 W of waste heat. I calculated that this
would raise the Earth's average temperature from 15 C to 51 C (123 F).
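A rough check of that temperature figure, assuming the surface temperature scales with the fourth root of total dissipated power (Stefan-Boltzmann). The intercepted solar power (~1.7 x 10^17 W, from the solar constant and Earth's cross section) and the 288 K (15 C) surface temperature are standard figures, not from the paper; the 10^17 W of waste heat is from the text.

```python
import math

SOLAR_CONSTANT = 1366.0   # W/m^2 at top of atmosphere (standard figure)
EARTH_RADIUS = 6.371e6    # m
T_SURFACE = 288.0         # K, about 15 C today

def surface_temp_with_waste_heat(p_waste):
    """Scale surface temperature by the 4th root of the power ratio."""
    # Solar power intercepted by Earth's cross-sectional disk, ~1.7e17 W
    p_sun = SOLAR_CONSTANT * math.pi * EARTH_RADIUS**2
    return T_SURFACE * ((p_sun + p_waste) / p_sun) ** 0.25

t_new = surface_temp_with_waste_heat(1e17)  # 10^17 W of waste heat
print(t_new - 273.15)  # roughly 50 C, consistent with the estimate above
```

Albedo and greenhouse effects are ignored here, so this only confirms the order of magnitude of the warming, not the exact value.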

2. I did some data compression experiments to more accurately estimate
the information content of both DNA and source code. I concluded that
1 line of code carries about 25 bits, and that 1 DNA base pair barely
compresses below 2 bits. Thus, a program of complexity equivalent to
the human genome would cost about $25 billion to write at $100 per
line. It only has to be written once, so this cost is negligible
compared to the costs of hardware and knowledge collection.
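The arithmetic behind that estimate, spelled out. The genome size (~3 x 10^9 base pairs) is a standard figure I am supplying; the 2 bits per base pair, 25 bits per line, and $100 per line come from the text.

```python
BASE_PAIRS = 3e9      # approximate human genome size (standard figure)
BITS_PER_BP = 2.0     # DNA barely compresses below 2 bits/bp
BITS_PER_LOC = 25.0   # information content of one line of code
COST_PER_LOC = 100.0  # dollars per line of code written

genome_bits = BASE_PAIRS * BITS_PER_BP  # ~6e9 bits of complexity
lines = genome_bits / BITS_PER_LOC      # ~2.4e8 equivalent lines of code
cost = lines * COST_PER_LOC             # total one-time cost
print(f"${cost/1e9:.0f} billion")       # prints "$24 billion"
```

So the $25 billion figure is the round-number version of about $24 billion.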

3. Table 2 summarizes the complexity, CPU, and memory requirements of
4 approaches to AI: engineered, evolutionary, cosmological, and
multiverse. These go to progressively lower complexity at
progressively higher hardware cost. This matters because if the
hardware problem can be solved by molecular computing (which appears
feasible), then the most significant remaining cost is collecting
human knowledge, which runs to hundreds of trillions of dollars.

Of the 3 alternatives to engineering, only the evolutionary approach
is remotely feasible. It requires 10^49 operations and 10^37 bits of
memory. These are improved estimates based on more accurate figures
for carbon production by the global biomass. Evolution is also highly
energy efficient at 10^-18 J per operation (10^4 times better than
the brain and 10^9 times better than silicon). Nevertheless, that
still works out to 3 billion years at 7 times our current global
energy consumption. It is theoretically possible to improve the
energy efficiency by a factor of 400 to reach the thermodynamic
limit, and by another factor of 1000 by capturing more sunlight, but
even that would only shorten a simulation of human evolution to about
10,000 years. Other optimizations might still make it feasible.
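A sanity check of the evolutionary timeline. The 10^49 operations, 10^-18 J/op, 7x energy factor, and 400x and 1000x improvements are from the text; current world energy consumption (~5.7 x 10^20 J/yr, i.e. about 18 TW average) is a standard figure I am supplying.

```python
OPS = 1e49            # operations to simulate evolution (from the text)
J_PER_OP = 1e-18      # energy per operation (from the text)
WORLD_ENERGY = 5.7e20 # J/yr, current global consumption (standard figure)

energy = OPS * J_PER_OP              # 1e31 J total energy budget
years = energy / (7 * WORLD_ENERGY)  # runtime at 7x current consumption
print(f"{years:.1e} years")          # about 2.5e9, i.e. ~3 billion years

# With the 400x thermodynamic-limit and 1000x sunlight-capture gains:
years_optimized = years / (400 * 1000)
print(f"{years_optimized:.0f} years")  # on the order of 10^4 years
```

Both printed values agree with the 3-billion-year and 10,000-year figures above to within the precision of the inputs.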

The real promise here is to use molecular computing in an engineered
approach, both to build the computing hardware and to collect human
knowledge by directly scanning brains with nano-robots, avoiding the
high cost of conventional communication. This approach carries a high
risk of a nanotechnology accident, however. I will have more to add
on this.


--
-- Matt Mahoney, [email protected]


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
