On Fri, May 3, 2013 at 1:19 PM, just camel <[email protected]> wrote:
> I haven't understood how OpenCog Prime would fail when it comes to
> recombining known concepts and stuff? In what way is our brain superior?
> Searching through a list, hash, neurons and synapses, atom space and
> recombining bits of information and nodes ... all the same to me? You have a
> pool of information and patterns and an algorithm or a method of iterating
> through that storage and applying some logic to them?

As a data structure for representing human knowledge and implementing
reasoning and learning, there is nothing wrong with OpenCog and
AtomSpace. What OpenCog lacks is the computing power and knowledge
base to implement it at the level of human intelligence.

We have discussed this before. Ben disagrees with my numbers based
solely on intuition, and refuses to offer any alternative numbers. He
simply doesn't know, but he still doesn't like my answers. His
intuition, apparently, is that the amount of resources he needs is
exactly the amount he is able to obtain, because otherwise he would fail.

Just to remind you, the specs for Homo Sapiens:
- Computing power: 10^15 neural weighted summations per second.
- Memory: 10^14 synapses.
- Sensory input rate: 10^9 bits per second (10^7 bps at optic nerve).
- Software: 3 x 10^9 DNA base pairs, equivalent to 300M lines of code.
- Knowledge base: 10^9 bits (99% shared with other humans).
- Natural language I/O: 5 bits per second.
- Power: 100 watts average, 1000 watts peak (25% mechanical efficiency).
- MTBF: 2 x 10^9 seconds.
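
A few ratios can be derived from these specs as a sanity check. This
is my own back-of-envelope sketch in Python; the seconds-per-year
constant and the "visits per synapse" framing are my additions, not
part of the spec list above:

```python
# Back-of-envelope ratios derived from the Homo Sapiens specs above.
knowledge_bits = 1e9        # knowledge base, bits
language_bps = 5            # natural language I/O rate, bits/second
seconds_per_year = 3.15e7   # approx. seconds per year (my constant)

# Time to transmit one person's knowledge base at language speed:
transmit_years = knowledge_bits / language_bps / seconds_per_year
print(transmit_years)       # ~6.3 years

synapses = 1e14             # memory
ops_per_sec = 1e15          # computing power
# On average each synapse is summed about 10 times per second:
print(ops_per_sec / synapses)   # 10.0
```

The ~6-year figure is at least the right order of magnitude for how
long a child takes to acquire language, which is one reason I trust
these numbers.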

Certain problems such as arithmetic and chess are amenable to more
efficient solutions on our primitive computers. But hard problems like
vision, language, art, and robotics have eluded the obvious approaches
that have already been tried. The most promising approaches are based
on neural networks, which implies that the above numbers are relevant.

Writing 300 million lines of code would cost about USD $30 billion.
But that is *not* the major cost: the code only needs to be written
once, which works out to about $5 per worker replaced.
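
The arithmetic behind those figures, as I read them. The $100-per-line
rate is the usual industry rule of thumb, and the ~6 billion divisor
is my reconstruction implied by $30B at $5 per worker; neither number
is stated explicitly above:

```python
# Software cost, spread over the workers it would replace.
lines_of_code = 300e6
cost_per_line = 100.0        # assumed rule of thumb, USD per line

total_cost = lines_of_code * cost_per_line
print(total_cost)            # 3e10, i.e. $30 billion

workers_replaced = 6e9       # implied divisor (my assumption)
print(total_cost / workers_replaced)   # 5.0, i.e. $5 per worker
```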

The major costs are electricity and knowledge collection. A 1 petaflop
computer (1 human brain equivalent) uses 1 MW, which costs $100 per
hour, or 20 times the global average wage rate. Moore's Law can reduce
this somewhat, but simply shrinking components will not work because
feature sizes are already down to 100 silicon atoms, and you can't
make them much smaller.
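
Spelling out the electricity figure: the $0.10/kWh price and the ~$5/hour
global average wage are my assumed inputs, chosen to make the stated
$100/hour and 20x ratio work out:

```python
# Electricity cost of one human-brain-equivalent (~1 petaflop) computer.
power_kw = 1000.0           # 1 MW expressed in kW
price_per_kwh = 0.10        # assumed USD electricity price

cost_per_hour = power_kw * price_per_kwh
print(cost_per_hour)        # 100.0, i.e. $100 per hour

global_avg_wage = 5.0       # assumed USD/hour (implied by the 20x ratio)
print(cost_per_hour / global_avg_wage)   # 20.0
```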

The other cost is the 10^17 bits of global human knowledge needed to
automate the economy, which has to be collected through speech and
writing from everyone on the planet at a cost of about $0.001 to $0.01
per bit, or $100 trillion to $1 quadrillion. Moore's Law won't help
you here.
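
The knowledge-collection range follows directly from the per-bit cost
band stated above:

```python
# Cost to collect the global human knowledge base, at the stated
# range of $0.001 to $0.01 per bit.
knowledge_bits = 1e17
low_per_bit, high_per_bit = 0.001, 0.01

print(knowledge_bits * low_per_bit)    # 1e14, i.e. $100 trillion
print(knowledge_bits * high_per_bit)   # 1e15, i.e. $1 quadrillion
```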

If you want to solve AGI, then I suggest working on (1) hardware
designs and algorithms to reduce power consumption and (2) global
public surveillance systems to reduce the cost of collecting human
knowledge. Replacing human labor with machines would save $70 trillion
per year worldwide in wages. The funding is there.


-- Matt Mahoney, [email protected]


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now