RE: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread shane legg
 
The total number of particles in the whole universe is usually
estimated to be around 10^80.  These guys claim that the storage
of the brain is 10^8432 bits.  That means that my brain has around
10^8352 bits of storage for every particle in the whole universe.

I thought I was feeling smarter than usual this morning!

Possible explanations:

1) The quote is totally wrong and the ^ should be a , ?

2) They got confused and thought it was 1 April

3) They are actually doing research into just how flaky AI
   researchers really are and how easy it is to publish
   mathematical nonsense in the journal Brain and Mind

4) The scientists somehow managed to get their PhDs without
   understanding how numbers work

5) They concluded that the brain is really analogue and so they
   worked out the volume of the skull at the Planck scale (actually
   that doesn't work either as the Planck length is far far far too
   large at 1.6 x 10^-35 m)

and so on...

Does anybody have a better explanation?

Shane


--- Amara D. Angelica [EMAIL PROTECTED] wrote: 
http://www.kurzweilai.net/news/news_printable.html?id=2417  
 
 Discovering the Capacity of Human Memory
 
 Brain and Mind, August 2003
 
 
 The memory capacity of the human brain is on the order of 10^8432 bits,
 three scientists have estimated. 
 
 Writing in the August issue of Brain and Mind, their OAR cognitive
 model asserts that human memory and knowledge are represented by a
 network of relations, i.e., connections of synapses between neurons,
 rather than by the neurons themselves as in the traditional
 information-container model (1 neuron = 1 bit). 
 
 This explains why the number of neurons in an adult brain seems
 stable; however, a huge amount of information can be remembered
 throughout the entire life of a person, they point out. 
 
 Based on the projected computer memory capacity of 8 x 10^12 bits in the
 next ten years, Yingxu Wang et al. conclude that the memory capacity of
 a human brain is equivalent to at least 10^8419 modern
 computers... This tremendous difference of memory magnitudes between
 human beings and computers demonstrates the efficiency of information
 representation, storage, and processing in the human brains. 
 
 They also conclude that this new factor has revealed the tremendous
 quantitative gap between the natural and machine intelligence and that
 next-generation computer memory systems may be built according to their
 relational model rather than the traditional container metaphor because
 the former is more powerful, flexible, and efficient, and is capable of
 generating a mathematically unlimited memory capacity by using a limited
 number of neurons in the brain or hardware cells in next-generation
 computers. 
 
 Brain and Mind 4 (2): 189-198, August 2003
 
 ---
 To unsubscribe, change your address, or temporarily deactivate your subscription, 
 please go to http://v2.listbox.com/member/[EMAIL PROTECTED] 




RE: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread Brad Wyble
Good point Shane, I didn't even pay attention to the ludicrous size of the 
number, so keen was I to get my rant out.  





Re: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread Pei Wang
The paper can be accessed at
http://www.enel.ucalgary.ca/People/wangyx/Publications/Papers/BM-Vol4.2-HMC.pdf

Their conclusion is based on the assumptions that there are 10^11 neurons
and that each neuron has on average 10^3 synaptic connections. Therefore
the total number of potential relational combinations is
(10^11)! / [(10^3)! ((10^11) - (10^3))!], which is approximately 10^8432.

The model is obviously an oversimplification, and the number is way too big.

Pei



RE: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread Amara D. Angelica

 1) The quote is totally wrong and the ^ should be a , ?

It's 10 to the 8432 power, according to the paper. This is the
theoretical memory capacity, not its actual size, but no estimates are
given for real-world typical size of memory, so the comparison with
machine capacity seems unrealistic.

The number is derived as follows:

Assuming there are n neurons in the brain, and on average
there are m connections between a given neuron and the rest of them, the
magnitude of the brain memory capacity can be expressed by the following
mathematical model, the human memory capacity model, as given below:

n!/[m!(n-m)!]

where n is the total number of neurons and m the number of average
partial connections between neurons.

However, this is extremely hard to calculate and is almost intractable
on a modern computer, because of the exponential complexity and the
recursive computational costs for such large n and m, so they did some
math tricks to estimate it. 
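Their "math trick" is presumably something along the lines of Stirling's approximation; a quick way to reproduce the figure without ever materializing the factorials is to work in log space. A sketch (my reconstruction, not the paper's method), using Python's log-gamma function:

```python
from math import lgamma, log

def log10_binomial(n, m):
    """log10 of C(n, m) = n! / (m! (n-m)!), computed via
    log-gamma so the enormous factorials never appear."""
    return (lgamma(n + 1) - lgamma(m + 1) - lgamma(n - m + 1)) / log(10)

# n = 10^11 neurons, m = 10^3 average connections per neuron
print(round(log10_binomial(10**11, 10**3)))  # -> 8432
```

This confirms that C(10^11, 10^3) is on the order of 10^8432, so the arithmetic in the paper is internally consistent; the problem is what the number is taken to mean.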

One issue I didn't see addressed in the paper is the constraint on
neurons actually being physically able to connect with distant ones. In
a real world computation, shouldn't the upper bound be dramatically
lower? 

By the way, not that it has any bearing on reality, but it's actually
10^90 bits that could be stored by the amount of matter that we have in
the universe right now, according to Seth Lloyd
(http://www.kurzweilai.net/meme/frame.html?main=/articles/art0530.html?m%3D3),
so at max capacity, each brain would require on the order of
10^8342 parallel universes to be converted to computronium. That's one
heck of a supercomputer. :)





Re: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread shane legg

Thanks for the link Pei.

The thing is that they are talking about the number of BITS not the
number of POSSIBLE STATES.  Given x bits the number of possible
states is 2^x.  For example with 32 bits you can have 2^32 different
states... or about 4,000,000,000 possible states.  

Thus, if the brain has 10^8432 bits of storage as they claim, then
the number of possible states is 2^(10^8432). 

To make things even worse, even if they realised their error and
decided that they actually meant possible states rather than bits,
the number of bits in that case becomes just
log_2 (10^8432) = 8432 * log_2 (10) = 28,010 bits,
or about 3.5 kilobytes of storage.  I'd like to think that I have
more than a 3.5 KB brain!!
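That sanity check is a one-liner (using the numbers as claimed in the paper):

```python
from math import log2

# If 10^8432 were the number of distinguishable *states* rather than
# bits, the storage needed to index them would be log2(10**8432):
bits = 8432 * log2(10)
print(round(bits))      # -> 28010 bits
print(round(bits / 8))  # -> 3501 bytes, i.e. about 3.5 KB
```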

They really should have sanity checked their results.

Shane



RE: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread Brad Wyble


It's also disconcerting that something like this can make it through the 
review process.

Transdisciplinary is oftentimes a euphemism for combining half-baked and 
ill-formed ideas from multiple domains into an incoherent mess. 

This paper is an excellent example.  (bad math + bad neuroscience != good 
paper)







Re: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread Bill Hibbard
On Mon, 15 Sep 2003, Amara D. Angelica wrote:

 Any commments on this paper?

 http://www.kluweronline.com/issn/1389-1987/current

Anders Sandberg's PhD thesis (thanks to Cole Kitchen for
originally posting this to the AGI list) at:

  http://akira.nada.kth.se/~asa/Thesis/thesis.pdf

entitled Bayesian Attractor Neural Network Models of
Memory, provides a more reasonable basis for estimating
human memory capacity. In Section 8.1 he roughly
estimates the capacity of the cortex at 10^10 patterns.

Bill



RE: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread shane legg
Yeah, it's a bit of a worry.

By the way, if anybody is trying to look it up, I spelt the guy's
name wrong; it's actually Stirling's approximation.  You can find
it in an online book here:

http://www.inference.phy.cam.ac.uk/mackay/itprnn/book.html

It's a great book, about 640 pages long.  The result I
used is equation 1.13 which is on page 2.
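For anyone reproducing the estimate, the Stirling route reduces the log of the binomial coefficient to the binary entropy function. A rough sketch of that shortcut (my reconstruction of the kind of calculation meant, not the book's exact notation):

```python
from math import log2

def log2_binomial_approx(n, r):
    """Stirling's ln x! ~ x ln x - x reduces log2 C(n, r) to
    n * H2(r/n), where H2 is the binary entropy in bits."""
    p = r / n
    return n * (-p * log2(p) - (1 - p) * log2(1 - p))

# Within a few bits of the exact value (about 28,012 bits),
# i.e. consistent with the paper's 10^8432 figure:
print(round(log2_binomial_approx(10**11, 10**3)))
```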

Shane







RE: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread James Rogers
 
 Their conclusion is based on the assumptions that there are 
 10^11 neurons and their average synapses number is 10^3. 
 Therefore the total potential relational combinations is 
 (10^11)! / [(10^3)! ((10^11) - (10^3))!], which is 
 approximately 10^8432.
 
 The model is obviously an oversimplification, and the number 
 is way too big.


I was wondering about that.  It seems that the number represents the size of the
phase space, when a more useful metric would be the size (Kolmogorov complexity)
of the average point *in* the phase space.  There is a world of difference
between the number of patterns that can be encoded and the size of the biggest
pattern that can be encoded; the former isn't terribly important, but the latter
is very important.

Cheers,

-James Rogers
 [EMAIL PROTECTED]





Re: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread Eliezer S. Yudkowsky
James Rogers wrote:
I was wondering about that.  It seems that the number represents the size of the
phase space, when a more useful metric would be the size (Kolmogorov complexity)
of the average point *in* the phase space.  There is a world of difference
between the number of patterns that can be encoded and the size of the biggest
pattern that can be encoded; the former isn't terribly important, but the latter
is very important.
Are you talking about the average point in the phase space in the sense 
of an average empirical human brain, or in the sense of a randomly 
selected point in the phase space?  I assume you mean the former, since, 
for the latter question, if you have a simple program P that produces a 
phase space of size 2^X, the average size of a random point in the phase 
space must be roughly X (plus the size of P?) according to both Shannon 
and Kolmogorov.

(Incidentally, I'll join in expressing my astonishment and dismay at the 
level of sheer mathematical and physical and computational ignorance on 
the part of authors and reviewers that must have been necessary for even 
the abstract of this paper to make it past the peer review process, and 
add that the result violates the Susskind holographic bound for an object 
that can be contained in a 1-meter sphere - no more than 10^70 bits of 
information.)
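The bound is easy to sanity-check numerically. A sketch, assuming the common form S <= A / (4 l_p^2) in nats for a sphere of radius 1 m (the exact prefactor and units vary by formulation, so treat this as order-of-magnitude only):

```python
from math import pi, log

l_p = 1.616e-35            # Planck length in meters
area = 4 * pi * 1.0**2     # surface area of a 1 m radius sphere, in m^2

# Holographic bound: entropy <= area / (4 l_p^2) nats; divide by ln 2 for bits
max_bits = area / (4 * l_p**2 * log(2))
print(f"~10^{log(max_bits, 10):.0f} bits")  # on the order of 10^70
```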

--
Eliezer S. Yudkowsky  http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


RE: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread James Rogers
Eliezer wrote:
 Are you talking about the average point in the phase space in the sense 
 of an average empirical human brain, or in the sense of a randomly
 selected point in the phase space?  I assume you mean the former, since, 
 for the latter question, if you have a simple program P that 
 produces a phase space of size 2^X, the average size of a random point 
 in the phase space must be roughly X (plus the size of P?) according to 
 both Shannon and Kolmogorov.


Arrgh...  What you said.  My post was sloppy, and I stated it really badly.  

I'm literally doing about 5-way multitasking today, all important things that
demand my attention.  It seems that my email time-slice is under-performing
under the circumstances.

Cheers,

-James Rogers
 [EMAIL PROTECTED]



Re: [agi] Discovering the Capacity of Human Memory

2003-09-16 Thread Eliezer S. Yudkowsky
The Tao is the set of truths that can be stored in zero bits.

--
Eliezer S. Yudkowsky  http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence