Stathis Papaioannou wrote:
> 2009/8/22 Brent Meeker <>:
>> That's an interesting question and one that I think relates to the
>> importance of context.  A scan of your brain would capture all the
>> information in the Shannon/Boltzmann sense, i.e. it would determine which
>> of the possible configurations and processes were realized.  However,
>> those concerned about the "hard problem" will point out that this
>> misses the fact that the information represents or "means" something.
>> To know the meaning of the information would require knowledge of the
>> world in which the brain acts and perceives, including a lot of
>> evolutionary history.  Imagine scanning the brain of an alien found in a
>> crash at Roswell.  Without knowledge of how he acts and the evolutionary
>> history of his species it would be essentially impossible to guess the
>> meaning of the patterns in his brain.  My point is that it is not just
>> computation that is consciousness or cognition, but computation with
>> meaning, which means within a certain context of action.
> You wouldn't be able to guess what the alien is thinking by scanning
> his brain, but you could then run a simulation, exposing it to various
> environmental stimuli, and it should behave the same way as the
> original brain (if weak AI is true) and have the same experiences as
> the original brain (if strong AI is true).

True.  But the point was directed at the MGA.  Part of the simulation 
must be outside the brain - and possibly a very great deal.  So 
while it seems intuitively clear that the brain can be emulated, it's 
not so clear that the brain plus enough environment can be.


You received this message because you are subscribed to the Google Groups 
"Everything List" group.