On Sat, Oct 25, 2014 at 7:29 AM, Tim Tyler via AGI <[email protected]> wrote:
> At the moment, transferring knowledge isn't really the main problem.
> It is more that our computers are too stupid to learn properly. Once
> computer learning capacities improve to more human-like levels, it
> seems as though the costs associated with teaching them will go down.
> At that stage you will be able to just show them what to do, and use
> 'ordinary' teaching methods - instead of hiring expensive computer
> programmers to instruct them.

I assume that computers will do all of these things. No, right now the
main problem is the lack of energy-efficient computation. Many AI
problems, especially vision, require enormous computation. If we
assume that automating a job takes a human-brain-sized neural network,
then automating the global workforce requires several billion
10-petaflop computers. Using current technology, each computer would
require
several megawatts of power. The cost of electricity alone makes them
uncompetitive with wages. But I don't think the problem is
insurmountable. IBM's TrueNorth spiking neural chips are a few hundred
times as energy efficient as an equivalent neural simulation on a
supercomputer, because a synapse operation is a 1-bit operation, vs.
roughly 1000 bit operations for a 32-bit multiply-accumulate. A
cluster of 400,000 chips (256M synapses per chip at 78 mW) simulating
a human-brain-sized neural network would use only about 30 kW. It is
unlikely we can reduce this
to the 20 watts used by the brain just by making smaller transistors
in silicon. Transistor features are already down to less than 100
atoms across. We will soon reach the physical limits of transistor
size and power consumption, just as clock speeds already stalled
several years ago.
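The cluster estimate above can be checked with a little arithmetic.
The per-chip figures (256M synapses, 78 mW) are the ones quoted above;
the ~10^14 synapses for a human brain is my assumed round figure, not
a measured one:

```python
# Sanity check of the TrueNorth cluster estimate.
synapses_per_chip = 256e6   # 256M synapses per TrueNorth chip (quoted above)
watts_per_chip = 0.078      # 78 mW per chip (quoted above)
human_synapses = 1e14       # assumed ~10^14 synapses in a human brain

chips = human_synapses / synapses_per_chip
power_kw = chips * watts_per_chip / 1000

print(f"chips needed: {chips:,.0f}")        # ~390,000, close to 400,000
print(f"cluster power: {power_kw:.1f} kW")  # ~30 kW
```

So the 400,000-chip, 30 kW numbers are self-consistent given a 10^14
synapse brain.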

My estimate of 10^17 bits of human knowledge assumes that each brain
stores 10^9 bits of useful knowledge and that 99% of this knowledge is
shared. 10^9 bits comes from Landauer's estimate of human long term
memory, and is also the amount of language that you process in a
lifetime at a compressed rate of 1 bit per character. I assume 99%
shared because the U.S. Labor Dept. estimates that it costs $15K (1%
of lifetime earnings) to replace an employee on average. Directly
communicating 10^17 bits at 5 bits per second (150 words per minute
with 50% in each direction) at the global average wage rate of $5 per
hour will cost $28 trillion. This cost will rise as the economy grows,
and will be higher still when collecting knowledge from people with
valuable skills. However, we can probably
reduce costs by using global public surveillance rather than
interviews to collect most of this data, and later perhaps by brain
scanning.
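The $28 trillion figure follows from the numbers above. All inputs are
the ones stated in the text, except the 10^10 population, which is my
rounded-up figure for the global workforce:

```python
# Sanity check of the knowledge-collection cost estimate.
bits_per_person = 1e9    # Landauer's estimate of long-term memory
population = 1e10        # assumed rounded global population
shared_fraction = 0.99   # 99% of knowledge assumed shared

unique_bits = population * bits_per_person * (1 - shared_fraction)
print(f"unique human knowledge: {unique_bits:.0e} bits")  # ~1e17 bits

bits_per_second = 5      # 150 wpm, 1 bit/char compressed, half each direction
wage_per_hour = 5.0      # global average wage in dollars

hours = unique_bits / bits_per_second / 3600
cost_trillions = hours * wage_per_hour / 1e12
print(f"cost: ${cost_trillions:.0f} trillion")  # ~$28 trillion
```

The cost scales linearly with the wage rate and inversely with the
speech rate, which is why rising wages push it up.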

Of course I am assuming that computers can communicate with each other
much faster than with humans, so I am only considering the cost to
humans for their time to show or tell the machines what they need to
know. Also, I am assuming that this knowledge will be collected for
the most part in the form of natural language speech and writing
rather than code. Writing code costs about 1000 times more per bit
than natural language.


-- 
-- Matt Mahoney, [email protected]

