On Thu, 23 Oct 2008 11:25:28 -0700, Jeffrey Nagelbush wrote:
>I recently ran across the following quote on line: "It is a little-known
>fact that the human brain has 10 times the memory capacity of the
>National Archives."
>(The site is http://findarticles.com/p/articles/mi_m0JSD/is_/ai_77196252)
It is not at all clear what the above statement means, because it implies that there is a common metric for measuring the information contained in the human brain as well as in the National Archives. As far as I know, there is no such metric. More below.

>I was wondering if anyone has actually tried to measure the
>capacity of the human memory and come up with an estimate?
>I have not succeeded in finding an estimate, though I have not
>searched thoroughly.

Let's start by reviewing the capacity of what has traditionally been referred to as "short-term memory". In the heyday of information theory it was thought that one should be able to measure the capacity of short-term memory in "bits". Indeed, one of the major reasons for the fame of George Miller's "Magical Number Seven" paper is that it showed that, because of the process of recoding/chunking, measurement in bits is meaningless. Seven zero-one digits can be thought of as seven bits or as a single chunk (e.g., 0000001).

One can then ask "how large can a chunk be?" As Chase & Ericsson (1982) showed with their subjects "SF" and "DD", there doesn't appear to be a limit. That is, as one becomes more skilled at recoding information, the organized unit can become ever larger. Increasing the organization of information reduces the "memory load", so that memory units containing large numbers of bits do not take up more memory. If this is true for short-term memory, it is quite likely to be true for long-term memory (especially on theories of memory that assert that short-term memory is just the activation of long-term memory).

This raises the question of whether a finite entity like the brain can contain an infinite amount of information. I'm sure that someone has addressed this point at some time, and the answer has to be "No". What then is the limit of human memory? Again, it is not clear what the appropriate metric is. And, I think, there is another mechanism at work.
Another way of thinking about recoding/chunking is that it is like a "compression" process, akin to the old pkzip (which produced *.zip files) and the newer "rar" and other formats. There are a large number of compression algorithms available now (I defer to the more computer-literate to expound on this), and one can ask "which algorithm produces the smallest compressed file?", "which algorithm compresses information most quickly?", and so on. If the mind/brain uses some algorithm(s) to compress information about the world (as well as internal states), then we won't really know the capacity of human memory until we know what that compression algorithm is. Or something like that.

-Mike Palij
New York University
[EMAIL PROTECTED]

---
To make changes to your subscription contact:

Bill Southerly ([EMAIL PROTECTED])
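As a footnote to the compression analogy above: the "which algorithm does best" question is easy to see concretely with Python's standard-library compressors. A minimal sketch (the sample text is arbitrary; the relative sizes will vary with the input, which is the point):

```python
import bz2
import lzma
import zlib

# Compare three standard-library compressors on the same input.
# Highly patterned data compresses to far below its raw size, and
# which algorithm "wins" depends on the structure of the data.
data = b"the human brain has 10 times the memory capacity " * 200

for name, compress in [("zlib", zlib.compress),
                       ("bz2", bz2.compress),
                       ("lzma", lzma.compress)]:
    out = compress(data)
    print(f"{name}: {len(data)} bytes -> {len(out)} bytes")
```

Each compressor is lossless (decompressing recovers the original exactly), so the differing output sizes reflect only how well each algorithm models the redundancy in the input.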
