> http://www.newscientist.com/article/mg20026845.000-memories-may-be-stored-on-your-dna.html

Actually, this makes sense. It explains most of the discrepancy between the 
10^9 bits of human long-term memory estimated by Landauer and the 10^15 
synapses in the human brain. If memory is stored in neurons (by gene regulation 
controlling the activation threshold), then you have only about 10^11 bits of 
storage, or one bit for each of the brain's roughly 10^11 neurons.

Here is how it could work. Imagine a neural network with fixed, randomly 
weighted synapses. Then insert a neuron at each synapse, with one input and one 
output. You could then apply Hebbian learning by modifying the conductivity of 
the middle neuron: if the input and output neurons fire at the same time, the 
middle neuron lowers its threshold when the two fixed weights have the same 
sign, and raises it when they have opposite signs. In other words, instead of 

A -> B

with a variable weight, you have

A -> M -> B

with a middle neuron M of variable conductivity and two fixed weights.
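A minimal sketch of that update rule (the function name, learning rate, and the choice to encode plasticity as a scalar threshold are my own illustrative assumptions, not from the post):

```python
# Hypothetical sketch of the A -> M -> B scheme. The weights w_in (A -> M)
# and w_out (M -> B) are fixed; only M's threshold is plastic.

def hebbian_update(threshold, a_fired, b_fired, w_in, w_out, rate=0.1):
    """Adjust M's threshold when A and B fire together."""
    if a_fired and b_fired:
        if w_in * w_out > 0:      # weights have the same sign:
            threshold -= rate     # lower threshold, M conducts more easily
        else:                     # weights have opposite signs:
            threshold += rate     # raise threshold, M conducts less easily
    return threshold
```

Lowering the threshold when the signs agree strengthens the effective A -> B coupling, which is the usual Hebbian direction.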

Of course, real neurons have thousands of inputs and outputs. This means that 
there are thousands of middle neurons between A and B, and these middle 
neurons connect to thousands of others as well. If these connections are 
random, then Hebbian learning applied to these thousands of middle neurons 
would reinforce only the A-B correlation, adding only minor noise to the other 
connections.
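A quick simulation of that claim (the +/-1 weights, the 0.01 step size, and the "unrelated neuron C" are my own illustrative assumptions): every middle-neuron path between A and B contributes positively to the A -> B coupling, while the contributions to an unrelated A -> C path have random signs and largely cancel.

```python
import random

random.seed(0)

N = 1000
w_in  = [random.choice([-1, 1]) for _ in range(N)]  # fixed A -> M_i weights
w_out = [random.choice([-1, 1]) for _ in range(N)]  # fixed M_i -> B weights
w_c   = [random.choice([-1, 1]) for _ in range(N)]  # fixed M_i -> C weights

# One round of A and B co-firing: each middle neuron's threshold drops
# if its two weights agree in sign, rises otherwise. Conductivity gain
# is the negative of the threshold change.
dthr = [(-0.01 if w_in[i] * w_out[i] > 0 else 0.01) for i in range(N)]

# Resulting change in effective connection strength along each path.
dAB = sum(w_in[i] * w_out[i] * -dthr[i] for i in range(N))
dAC = sum(w_in[i] * w_c[i]  * -dthr[i] for i in range(N))

print(dAB)  # every path contributes +0.01, so about N * 0.01 = 10
print(dAC)  # a random walk of +/-0.01 steps, so magnitude ~ 0.01 * sqrt(N)
```

The signal on the trained pair grows like N while the crosstalk on other pairs grows only like sqrt(N), which is the sense in which the random middle neurons "correlate only with AB and create minor noise for other neurons."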

-- Matt Mahoney, [EMAIL PROTECTED]



-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/