Stathis Papaioannou wrote:
> Brent Meeker writes (quoting SP):
> 
> 
>>> A lot of the stuff criticising Chalmers' thesis is quite strident, at
>>> least by the usual academic standards. It's not quite as severe as the
>>> reaction to Roger Penrose's theories on the mind, but almost. Many
>>> cognitive scientists seem to take anything not clearly straightforward
>>> materialism as automatically false or even nonsense. I sympathise with
>>> them to a degree: I think we should push materialism and reductionism as
>>> far as we can. But the inescapable fact remains, I could know every
>>> empirical fact about a conscious system, but still have no idea what it
>>> is actually like to *be* that system, as it were from the inside.
>>
>> That's commonly said, but is it really true?  Even without knowing
>> anything about another person's brain you have a lot of ideas about what
>> it is like to be that person.  Suppose you really knew a lot about an
>> artificial brain, as in a planetary probe for example, and you also knew
>> a lot about your own brain, so you could compare responses both at the
>> behavioural level and at the "brain" level.  I think you could infer a
>> lot about what it was like to be that probe.  You just couldn't directly
>> experience its experiences - but that's not surprising.
> 
> 
> You have an idea of what it is like to be another person because you are
> one yourself. A completely alien being might actually know more about how
> a human brain works than any human, even to the extent where he could
> manufacture a fully working and conscious brain, but he would not
> necessarily have any idea at all about what it is like to be a human
> unless by accident his own mind turned out to be similar to ours - and
> even then he couldn't be sure.

That's simply an assumption.  When we know how to make a conscious brain we
may find that we do have a good idea of what it experiences - as evidenced
by its self-reports and other behaviour.

> On the other hand, if you know every empirical fact about a non-conscious
> entity well enough to make an exact working replica, then you know
> everything there is to know about it. We could define consciousness as
> what is left over when you subtract what can be known about an entity by
> an external observer from what can be known by being that entity yourself.

If there is anything left over.  I don't think it is sufficiently
appreciated that this "unknowability" is an assumption.

Brent Meeker
