Re: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread Eliezer S. Yudkowsky
The Tao is the set of truths that can be stored in zero bits.

--
Eliezer S. Yudkowsky  http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


RE: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread James Rogers
Eliezer wrote:
> Are you talking about the "average" point in the phase space in the sense 
> of an average empirical human brain, or in the sense of a randomly
> selected point in the phase space?  I assume you mean the former, since, 
> for the latter question, if you have a simple program P that 
> produces a phase space of size 2^X, the average size of a random point 
> in the phase space must be roughly X (plus the size of P?) according to 
> both Shannon and Kolmogorov.


Arrgh...  What you said.  My post was sloppy, and I stated it really badly.  

I'm literally doing about 5-way multitasking today, all "important" things that
demand my attention.  It seems that my email time-slice is under-performing
under the circumstances.

Cheers,

-James Rogers
 [EMAIL PROTECTED]



Re: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread Eliezer S. Yudkowsky
James Rogers wrote:
I was wondering about that.  It seems that the number represents the size of the
phase space, when a more useful metric would be the size (Kolmogorov complexity)
of the average point *in* the phase space.  There is a world of difference
between the number of patterns that can be encoded and the size of the biggest
pattern that can be encoded; the former isn't terribly important, but the latter
is very important.
Are you talking about the "average" point in the phase space in the sense 
of an average empirical human brain, or in the sense of a randomly 
selected point in the phase space?  I assume you mean the former, since, 
for the latter question, if you have a simple program P that produces a 
phase space of size 2^X, the average size of a random point in the phase 
space must be roughly X (plus the size of P?) according to both Shannon 
and Kolmogorov.
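
To spell out the Shannon half in the LaTeX shorthand used elsewhere in this
thread: for a uniform distribution over a phase space S with |S| = 2^X
points,

  H(S) = -\sum_{s \in S} 2^{-X} \log_2 2^{-X} = X

so a randomly selected point costs X bits on average to specify, and on the
Kolmogorov side at most roughly X plus the size of P, since naming P plus an
X-bit index pins the point down.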

(Incidentally, I'll join in expressing my astonishment and dismay at the 
level of sheer mathematical and physical and computational ignorance on 
the part of authors and reviewers that must have been necessary for even 
the abstract of this paper to make it past the peer review process, and 
add that the result violates the Susskind holographic bound for an object 
that can be contained in a 1-meter sphere - no more than 10^70 bits of 
information.)
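
(Back-of-envelope check of that bound, as a Python sketch; the form
bits <= A / (4 l_p^2 ln 2) is the standard statement, and I am reading
"1-meter sphere" as 1 m in diameter; a 1 m radius only gains a factor of 4:

  import math

  l_p = 1.616e-35                     # Planck length, meters
  r = 0.5                             # radius of a 1-meter sphere, meters
  area = 4 * math.pi * r ** 2         # bounding surface area, m^2
  bits = area / (4 * l_p ** 2 * math.log(2))
  print(f"~10^{math.log10(bits):.0f} bits")   # ~10^70

so 10^8432 bits overshoots the bound by more than 8000 orders of magnitude.)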

--
Eliezer S. Yudkowsky  http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


RE: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread James Rogers
> 
> Their conclusion is based on the assumptions that there are 
> 10^11 neurons and their average synapses number is 10^3. 
> Therefore the total potential relational combinations is 
> (10^11)! / [(10^3)! ((10^11) - (10^3))!], which is 
> approximately 10^8432.
> 
> The model is obviously an oversimplification, and the number 
> is way too big.


I was wondering about that.  It seems that the number represents the size of the
phase space, when a more useful metric would be the size (Kolmogorov complexity)
of the average point *in* the phase space.  There is a world of difference
between the number of patterns that can be encoded and the size of the biggest
pattern that can be encoded; the former isn't terribly important, but the latter
is very important.

Cheers,

-James Rogers
 [EMAIL PROTECTED]





RE: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread Ben Goertzel



Of course, doing this combinatorial calculation is really not all that
meaningful...  I'm not sure why we would want to count unordered sets of
neural interconnections, when in fact temporal dynamics may be important to
the brain, so that in some cases the same set of synapses considered in a
different temporal order might "denote" different things (using the word
"denote" very loosely here).

Shane's 10^10 bits figure is very nice as it's just around a gigabyte, i.e.
the amount of RAM on a juiced-up contemporary PC.   (Though, of course, to
emulate human thought in real-time one would need a lot of processors per
each gigabyte of RAM).
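
Checking that arithmetic in Python:

  print(10 ** 10 / 8 / 2 ** 30)   # ~1.16 GiB, so "around a gigabyte" holds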

Based on a number of different calculations I've seen, my suspicion is that
this is off by at worst a couple orders of magnitude.  This is why I believe
that, at present, AI is most probably less of a hardware problem (or a
software implementation problem) and more of a "create a workable design"
problem.

However, to make a side point, it's also the case that it requires more
resources to TEST and CREATE an AI than to run one.  For our work on
Novamente, we run a lot of empirical tests to understand the behavior of
various system components, and to run tests on N copies of a Novamente
system obviously requires N times the resources one would need to simply run
a Novamente system.

This is similar to a well-known fact about text and image compression
research --- that doing this kind of research requires huge amounts of
memory on one's computers ;-)  [even though what one is ultimately getting
at is *compression*] !

-- Ben

> > "Assuming there are n neurons in the brain, and on average
> > there are m connections between a given neuron and the rest of them, the
> > magnitude of the brain memory capacity can be expressed by the following
> > mathematical model, the human memory capacity model, as given below:
> >
> > n!/[m!(n-m)!]
> >
> > where n is the total number of neurons and m the number of average
> > partial connections between neurons.
> >
> > However, this is "extremely hard to calculate and is almost intractable
> > using a modern computer, because of the exponential complicity or the
> > recursive computational costs for such large n and m," so they did some
> > math tricks to estimate it.
>
> Well the standard way to estimate (sorry about reverting to LaTeX here
> for those of you who aren't mathematicians... ) this combination is
> to use the log form of Sterling's equation:
>
> \log_2 \binom{n}{m} = (n-m) \log_2 \frac{n}{n-m} + m \log_2 \frac{n}{m}
>
> here n = 10^9 (neurons in brain)
> and  m = 10^4 (10,000 connections per neuron)
>
> These are my numbers, if they use slightly different ones it won't
> matter too much.
>
> This gives approx,
>
> 10^9 \log_2 10^5 + 10^4 \log_2 10^5
>
> which is about 10^10 bits which is about what I'd expect very roughly.
> They should have talked to a mathematician first.  Estimating basic
> combinatorics like n!/[m!(n-m)!] is not hard.
>
> Shane
>
>


RE: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread shane legg
Yeah, it's a bit of a worry.

By the way, if anybody is trying to look it up, I spelt the guy's
name wrong, it's actually "Stirling's equation".  You can find
it in an online book here:

http://www.inference.phy.cam.ac.uk/mackay/itprnn/book.html

It's a great book, about 640 pages long.  The result I
used is equation 1.13 which is on page 2.

Shane



 --- Brad Wyble <[EMAIL PROTECTED]> wrote: > 
> 
> It's also disconcerting that something like this can make it through the 
> review process.
> 
> Transdisciplinary is oftentimes a euphemism for combining half-baked and 
> ill-formed ideas from multiple domains into an incoherent mess. 
> 
> This paper is an excellent example.  (bad math + bad neuroscience != good 
> paper)
> 
> 
> 
> 
> 




Re: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread Bill Hibbard
On Mon, 15 Sep 2003, Amara D. Angelica wrote:

> Any commments on this paper?
>
> http://www.kluweronline.com/issn/1389-1987/current

Anders Sandberg's PhD thesis (thanks to Cole Kitchen for
originally posting this to the AGI list) at:

  http://akira.nada.kth.se/~asa/Thesis/thesis.pdf

entitled Bayesian Attractor Neural Network Models of
Memory, provides a more reasonable basis for estimating
human memory capacity. In Section 8.1 he roughly
estimates the capacity of the cortex at 10^10 "patterns".

Bill



RE: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread Brad Wyble


It's also disconcerting that something like this can make it through the 
review process.

Transdisciplinary is oftentimes a euphemism for combining half-baked and 
ill-formed ideas from multiple domains into an incoherent mess. 

This paper is an excellent example.  (bad math + bad neuroscience != good 
paper)







RE: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread shane legg
 --- "Amara D. Angelica" <[EMAIL PROTECTED]> wrote: > 
> > 1) The quote is totally wrong and the "^" should be a ","?
> 
> It's 10 to the 8432 power, according to the paper. This is the
> theoretical memory capacity, not its actual size, but no estimates are
> given for real-world typical size of memory, so the comparison with
> machine capacity seems unrealistic.

Really, they should have realised that 10^8432 bits makes no sense
at all.


> The number is derived as follows:
> 
> "Assuming there are n neurons in the brain, and on average
> there are m connections between a given neuron and the rest of them, the
> magnitude of the brain memory capacity can be expressed by the following
> mathematical model, the human memory capacity model, as given below:
> 
> n!/[m!(n-m)!]
> 
> where n is the total number of neurons and m the number of average
> partial connections between neurons.
> 
> However, this is "extremely hard to calculate and is almost intractable
> using a modern computer, because of the exponential complicity or the
> recursive computational costs for such large n and m," so they did some
> math tricks to estimate it. 

Well the standard way to estimate (sorry about reverting to LaTeX here
for those of you who aren't mathematicians... ) this combination is
to use the log form of Sterling's equation:

\log_2 \binom{n}{m} = (n-m) \log_2 \frac{n}{n-m} + m \log_2 \frac{n}{m}

here n = 10^9 (neurons in brain)
and  m = 10^4 (10,000 connections per neuron)

These are my numbers, if they use slightly different ones it won't
matter too much.

This gives approx,

10^9 \log_2 10^5 + 10^4 \log_2 10^5

which is about 10^10 bits which is about what I'd expect very roughly.
They should have talked to a mathematician first.  Estimating basic
combinatorics like n!/[m!(n-m)!] is not hard.
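
As a sketch (mine, not from the paper), the same estimate in Python, using
the paper's n = 10^11 and m = 10^3 so the result is comparable with their
figure:

  import math

  # Stirling log-form estimate of log2 C(n, m), as in the equation above.
  def log2_binomial(n, m):
      return (n - m) * math.log2(n / (n - m)) + m * math.log2(n / m)

  log2_states = log2_binomial(1e11, 1e3)      # the paper's n and m
  print(log2_states)                          # ~28,000
  print(log2_states * math.log10(2))          # ~8,434

That reproduces their ~10^8432 almost exactly, but as a count of possible
wirings, not a count of bits.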

Shane





Re: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread shane legg

Thanks for the link Pei.

The thing is that they are talking about the number of BITS not the
number of POSSIBLE STATES.  Given x bits the number of possible
states is 2^x.  For example with 32 bits you can have 2^32 different
states... or about 4,000,000,000 possible states.  

Thus, if the brain has 10^8432 bits of storage as they claim, then
the number of possible states is 2^(10^8432). 

To make things even worse, even if they realised their error, decided that
they didn't understand what a bit is, and concluded that they actually meant
"possible states", the number of bits in that case becomes just
log_2 (10^8432) = 8432 * log_2 (10) = 28,010 bits, or about 3.5 kilobytes
of storage.  I'd like to think that I have more than a 3.5 KB brain!!

They really should have "sanity checked" their results.
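
In Python the whole sanity check is two lines:

  import math

  bits = 8432 * math.log2(10)           # read 10^8432 as a state count
  print(round(bits), round(bits / 8))   # ~28,010 bits, ~3,501 bytes (~3.5 KB)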

Shane


 --- Pei Wang <[EMAIL PROTECTED]> wrote:
> The paper can be accessed at
> http://www.enel.ucalgary.ca/People/wangyx/Publications/Papers/B&M-Vol4.2-HMC.pdf
> 
> Their conclusion is based on the assumptions that there are 10^11 neurons
> and their average synapses number is 10^3. Therefore the total potential
> relational combinations is
> (10^11)! / [(10^3)! ((10^11) - (10^3))!], which is approximately 10^8432.
> 
> The model is obviously an oversimplification, and the number is way too big.
> 
> Pei
> 
> - Original Message - 
> From: "shane legg" <[EMAIL PROTECTED]>
> To: <[EMAIL PROTECTED]>
> Sent: Tuesday, September 16, 2003 6:24 AM
> Subject: RE: [agi] Discovering the Capacity of Human Memory
> 
> 
> >
> > The total number of particles in the whole universe is usually
> > estimated to be around 10^80.  These guys claim that the storage
> > of the brain is 10^8432 bits.  That means that my brain has around
> > 10^8352 bits of storage for every particle in the whole universe.
> >
> > I thought I was feeling smarter than usual this morning!
> >
> > Possible explanations:
> >
> > 1) The quote is totally wrong and the "^" should be a ","?
> >
> > 2) They got confused and thought it was 1 April
> >
> > 3) They are actually doing research into just how flaky AI
> >researchers really are and how easy it is to publish
> >mathematical nonsense in "Mind and Brain" Journal
> >
> > 4) The "scientists" somehow managed to get their PhDs without
> >understanding how numbers work
> >
> > 5) They concluded that the brain is really analogue and so they
> >worked out the volume of the skull at the Planck scale (actually
> >    that doesn't work either, as the Planck length is far far far too
> >    large at 1.6 x 10^-35 m)
> >
> > and so on...
> >
> > Does anybody have a better explanation?
> >
> > Shane
> >
> >
> > --- "Amara D. Angelica" <[EMAIL PROTECTED]> wrote: >
> > http://www.kurzweilai.net/news/news_printable.html?id=2417
> > >
> > > Discovering the Capacity of Human Memory
> > >
> > > Brain and Mind, August 2003
> > >
> > >
> > > The memory capacity of the human brain is on the order of 10^8432 bits,
> > > three scientists have estimated.
> > >
> > > Writing in the August issue of Brain and Mind, their "OAR" cognitive
> > > model asserts that human memory and knowledge are represented by a
> > > network of relations, i.e., connections of synapses between neurons,
> > > rather than by the neurons themselves as in the traditional
> > > information-container model (1 neuron = 1 bit).
> > >
> > > This explains why "the magnitude of neurons in an adult brain seems
> > > stable; however, huge amount of information can be remembered throughout
> > > the entire life of a person," they point out.
> > >
> > > Based on the projected computer memory capacity of 8 x 10^12 bits in the
> > > next ten years, Yingxu Wang et al. conclude that the memory capacity of
> > > a human brain is equivalent to at least "10^8419 modern
> > > computers... This tremendous difference of memory magnitudes between
> > > human beings and computers demonstrates the efficiency of information
> > > representation, storage, and processing in the human brains."
> > >
> > > They also conclude that "this new factor has revealed the tremendous
> > > quantitative gap between the natural and machine intelligence" and that
> > > "next-generation computer memory systems may be built according to their
> > > relational model rather than the traditional container metaphor"
> > > because "the former is more powerful, flexible, and efficient, and is
> > > capable of generating a mathematically unlimited memory capacity by
> > > using limited number of neurons in the brain or hardware cells in the
> > > next generation computers."
> > >
> > > Brain and Mind 4 (2): 189-198, August 2003

RE: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread Amara D. Angelica

> 1) The quote is totally wrong and the "^" should be a ","?

It's 10 to the 8432 power, according to the paper. This is the
theoretical memory capacity, not its actual size, but no estimates are
given for real-world typical size of memory, so the comparison with
machine capacity seems unrealistic.

The number is derived as follows:

"Assuming there are n neurons in the brain, and on average
there are m connections between a given neuron and the rest of them, the
magnitude of the brain memory capacity can be expressed by the following
mathematical model, the human memory capacity model, as given below:

n!/[m!(n-m)!]

where n is the total number of neurons and m the number of average
partial connections between neurons.

However, this is "extremely hard to calculate and is almost intractable
using a modern computer, because of the exponential complicity or the
recursive computational costs for such large n and m," so they did some
math tricks to estimate it. 

One issue I didn't see addressed in the paper is the constraint on
neurons actually being physically able to connect with distant ones. In
a real world computation, shouldn't the upper bound be dramatically
lower? 

By the way, not that it has any bearing on reality, but it's actually
10^90 bits that could be stored by the amount of matter that we have in
the universe right now, according to Seth Lloyd
(http://www.kurzweilai.net/meme/frame.html?main=/articles/art0530.html?m%3D3),
so at max capacity, each brain would require on the order of
10^8342 parallel universes to be converted to computronium. That's one
heck of a supercomputer. :)
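
A one-line check of that exponent, assuming Lloyd's 10^90 bits per universe:

  # universes = bits required / bits per universe, in powers of ten
  print(f"10^{8432 - 90} universes")   # 10^8342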





Re: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread Pei Wang
The paper can be accessed at
http://www.enel.ucalgary.ca/People/wangyx/Publications/Papers/B&M-Vol4.2-HMC.pdf

Their conclusion is based on the assumptions that there are 10^11 neurons
and their average synapses number is 10^3. Therefore the total potential
relational combinations is
(10^11)! / [(10^3)! ((10^11) - (10^3))!], which is approximately 10^8432.

The model is obviously an oversimplification, and the number is way too big.
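
For what it's worth, their arithmetic itself checks out; a quick Python
sketch via log-gamma (my check, not theirs):

  from math import lgamma, log

  n, m = 1e11, 1e3   # the paper's assumptions: neurons, average synapses
  log10_C = (lgamma(n + 1) - lgamma(m + 1) - lgamma(n - m + 1)) / log(10)
  print(log10_C)     # ~8434, close to their 10^8432 (combinations, not bits)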

Pei

- Original Message - 
From: "shane legg" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Tuesday, September 16, 2003 6:24 AM
Subject: RE: [agi] Discovering the Capacity of Human Memory


>
> The total number of particles in the whole universe is usually
> estimated to be around 10^80.  These guys claim that the storage
> of the brain is 10^8432 bits.  That means that my brain has around
> 10^8352 bits of storage for every particle in the whole universe.
>
> I thought I was feeling smarter than usual this morning!
>
> Possible explanations:
>
> > 1) The quote is totally wrong and the "^" should be a ","?
>
> 2) They got confused and thought it was 1 April
>
> 3) They are actually doing research into just how flaky AI
>researchers really are and how easy it is to publish
>mathematical nonsense in "Mind and Brain" Journal
>
> 4) The "scientists" somehow managed to get their PhDs without
>understanding how numbers work
>
> 5) They concluded that the brain is really analogue and so they
>worked out the volume of the skull at the Planck scale (actually
>that doesn't work either as the Planck length is far far far to
>large at 1.6 x 10^-35 m)
>
> and so on...
>
> Does anybody have a better explanation?
>
> Shane
>
>
> --- "Amara D. Angelica" <[EMAIL PROTECTED]> wrote: >
> http://www.kurzweilai.net/news/news_printable.html?id=2417
> >
> > Discovering the Capacity of Human Memory
> >
> > Brain and Mind, August 2003
> >
> >
> > The memory capacity of the human brain is on the order of 10^8432 bits,
> > three scientists have estimated.
> >
> > Writing in the August issue of Brain and Mind, their "OAR" cognitive
> > model asserts that human memory and knowledge are represented by a
> > network of relations, i.e., connections of synapses between neurons,
> > rather than by the neurons themselves as in the traditional
> > information-container model (1 neuron = 1 bit).
> >
> > This explains why "the magnitude of neurons in an adult brain seems
> > stable; however, huge amount of information can be remembered throughout
> > the entire life of a person," they point out.
> >
> > Based on the projected computer memory capacity of 8 x 10^12 bits in the
> > next ten years, Yingxu Wang et al. conclude that the memory capacity of
> > a human brain is equivalent to at least "10^8419 modern
> > computers... This tremendous difference of memory magnitudes between
> > human beings and computers demonstrates the efficiency of information
> > representation, storage, and processing in the human brains."
> >
> > They also conclude that "this new factor has revealed the tremendous
> > quantitative gap between the natural and machine intelligence" and that
> > "next-generation computer memory systems may be built according to their
> > relational model rather than the traditional container metaphor" because
> > "the former is more powerful, flexible, and efficient, and is capable of
> > generating a mathematically unlimited memory capacity by using limited
> > number of neurons in the brain or hardware cells in the next generation
> > computers."
> >
> > Brain and Mind 4 (2): 189-198, August 2003
> >




RE: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread Brad Wyble
Good point Shane, I didn't even pay attention to the ludicrous size of the 
number, so keen was I to get my rant out.  





RE: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread shane legg
 
The total number of particles in the whole universe is usually
estimated to be around 10^80.  These guys claim that the storage
of the brain is 10^8432 bits.  That means that my brain has around
10^8352 bits of storage for every particle in the whole universe.

I thought I was feeling smarter than usual this morning!

Possible explanations:

1) The quote is totally wrong and the "^" should be a ","?

2) They got confused and thought it was 1 April

3) They are actually doing research into just how flaky AI
   researchers really are and how easy it is to publish
   mathematical nonsense in "Mind and Brain" Journal

4) The "scientists" somehow managed to get their PhDs without
   understanding how numbers work

5) They concluded that the brain is really analogue and so they
   worked out the volume of the skull at the Planck scale (actually
   that doesn't work either, as the Planck length is far far far too
   large at 1.6 x 10^-35 m; see the sketch below)

and so on...
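
For explanation 5), a quick Python sketch (my numbers: a ~1.5 litre skull
and a generous one bit per Planck volume):

  import math

  planck_length = 1.6e-35                    # meters
  skull_volume = 1.5e-3                      # cubic meters, ~1.5 litres
  bits = skull_volume / planck_length ** 3   # one bit per Planck volume
  print(f"~10^{math.log10(bits):.0f} bits")  # ~10^102, nowhere near 10^8432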

Does anybody have a better explanation?

Shane


--- "Amara D. Angelica" <[EMAIL PROTECTED]> wrote: >
http://www.kurzweilai.net/news/news_printable.html?id=2417  
> 
> Discovering the Capacity of Human Memory
> 
> Brain and Mind, August 2003
> 
> 
> The memory capacity of the human brain is on the order of 10^8432 bits,
> three scientists have estimated. 
> 
> Writing in the August issue of Brain and Mind, their "OAR" cognitive
> model asserts that human memory and knowledge are represented by a
> network of relations, i.e., connections of synapses between neurons,
> rather than by the neurons themselves as in the traditional
> information-container model (1 neuron = 1 bit). 
> 
> This explains why "the magnitude of neurons in an adult brain seems
> stable; however, huge amount of information can be remembered throughout
> the entire life of a person," they point out. 
> 
> Based on the projected computer memory capacity of 8 x 10^12 bits in the
> next ten years, Yingxu Wang et al. conclude that the memory capacity of
> a human brain is equivalent to at least "10^8419 modern
> computers... This tremendous difference of memory magnitudes between
> human beings and computers demonstrates the efficiency of information
> representation, storage, and processing in the human brains." 
> 
> They also conclude that "this new factor has revealed the tremendous
> quantitative gap between the natural and machine intelligence" and that
> "next-generation computer memory systems may be built according to their
> relational model rather than the traditional container metaphor" because
> "the former is more powerful, flexible, and efficient, and is capable of
> generating a mathematically unlimited memory capacity by using limited
> number of neurons in the brain or hardware cells in the next generation
> computers." 
> 
> Brain and Mind 4 (2): 189-198, August 2003
> 




Re: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread Brad Wyble

We are too far away from understanding the basis of storage in the human 
brain to attempt quantitative estimates such as this.  We don't know 
enough about the number of synapses (the strong assumption is made that 
the average # is 1000).  We don't know enough about the fidelity of 
information stored in those synapses.  We don't know whether there are 
entirely unknown methods of information storage (in RNA, or other 
dendritic mechanisms that have yet to be identified).  We haven't even 
nailed down all the methods of information transmission (non-synaptic 
forms such as gap junctions and nitric oxide).

There are just way too many assumptions involved here. 

Even if we did nail this figure down exactly, the entire concept of memory
bits as the primary feature of the brain's processing power is not well
founded.  What is important in the brain is the way this information is 
stored and the way that it is reconstructed.  You can take your 10^8432 
bits and still end up with a drastically stupid machine.  The brain seems 
to perform PCA-like encodings of its inputs, and the compression of 
information that results from this process is one of the key features of 
its function.  This is why we are much worse than we think we are at 
remembering little details: our brain stores a skeletal, compressed 
representation.
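
To illustrate the kind of encoding I mean, here is a toy PCA sketch in
Python/numpy (an illustration with made-up data, not a brain model):

  import numpy as np

  # Inputs that look 100-dimensional but have only 10 underlying
  # degrees of freedom compress to 10 numbers each.
  rng = np.random.default_rng(0)
  data = rng.normal(size=(1000, 10)) @ rng.normal(size=(10, 100))

  centered = data - data.mean(axis=0)
  _, _, vt = np.linalg.svd(centered, full_matrices=False)  # principal axes
  codes = centered @ vt[:10].T          # skeletal, compressed representation
  recon = codes @ vt[:10]               # reconstruction from the skeleton
  print(codes.shape)                            # (1000, 10)
  print(float(np.abs(recon - centered).max()))  # ~0: little detail lost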

Evolution has figured out that the key to intelligence is how you store 
things, not how *much*.  Complexity is cheaper than raw power.

My continuing plea to you hardcore AI enthusiasts, as a neuroscientist, is
to abandon the memory bits and CPU cycle descriptions of brain power.
You are running down a blind alley.
If we could reach 50 years into the future and steal a machine with a 
million petabytes of RAM and a similarly outrageous CPU speed, we would be 
only a tiny fraction of a step closer to simulating the brain.  We'd have 
no idea what to do with that power and the devil is in the details.


-Brad Wyble




RE: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread Amara D. Angelica
http://www.kurzweilai.net/news/news_printable.html?id=2417  

Discovering the Capacity of Human Memory

Brain and Mind, August 2003


The memory capacity of the human brain is on the order of 10^8432 bits,
three scientists have estimated. 

Writing in the August issue of Brain and Mind, their "OAR" cognitive
model asserts that human memory and knowledge are represented by a
network of relations, i.e., connections of synapses between neurons,
rather than by the neurons themselves as in the traditional
information-container model (1 neuron = 1 bit). 

This explains why "the magnitude of neurons in an adult brain seems
stable; however, huge amount of information can be remembered throughout
the entire life of a person," they point out. 

Based on the projected computer memory capacity of 8 x 10^12 bits in the
next ten years, Yingxu Wang et al. conclude that the memory capacity of
a human brain is equivalent to at least "10^8419 modern
computers... This tremendous difference of memory magnitudes between
human beings and computers demonstrates the efficiency of information
representation, storage, and processing in the human brains." 

They also conclude that "this new factor has revealed the tremendous
quantitative gap between the natural and machine intelligence" and that
"next-generation computer memory systems may be built according to their
relational model rather than the traditional container metaphor" because
"the former is more powerful, flexible, and efficient, and is capable of
generating a mathematically unlimited memory capacity by using limited
number of neurons in the brain or hardware cells in the next generation
computers." 

Brain and Mind 4 (2): 189-198, August 2003



RE: [agi] Discovering the Capacity of Human Memory

2003-09-15 Thread Amara D. Angelica
Should read: 10^8432 bits
