On Sunday, January 12, 2014 10:51:37 AM UTC-5, Bruno Marchal wrote:
> On 12 Jan 2014, at 14:40, Craig Weinberg wrote: 
> > Here then is a simpler and more familiar example of how computation 
> > can differ from natural understanding, one which is not susceptible to 
> > any mereological Systems argument. 
> > 
> > If any of you have used passwords which are based on a pattern of 
> > keystrokes rather than the letters on the keys, you know that you   
> > can enter your password every day without ever knowing what it is   
> > you are typing (something with a #r5f^ in it…?). 
> > 
> > I think this is a good analogy for machine intelligence. By storing   
> > and copying procedures, a pseudo-semantic analysis can be performed,   
> > but it is an instrumental logic that has no way to access the   
> > letters of the ‘human keyboard’. The universal machine’s keyboard is   
> > blank and consists only of theoretical x,y coordinates where keys   
> > would be. No matter how good or sophisticated the machine is, it   
> > will still have no way to understand what the particular keystrokes   
> > "mean" to a person, only how they fit in with whatever set of fixed   
> > possibilities has been defined. 
> > 
> You confuse levels of description. 

I think that the existence of levels of description is itself what invalidates comp.

> What you say does not distinguish an   
> organic brain from a silicon one. 

Sure, but we have to give the organic brain the benefit of the doubt 
regarding its association with consciousness. Since silicon does not 
naturally seek to organize itself as a brain, we should doubt that it is 
associated with human consciousness by default.


> The understanding is not done by the 
> computation in the brain, but by the person having some role in some 
> history, and it only manifests itself through some computations (assuming 
> comp). 

I don't see that computations can manifest anything by themselves though.

> > Taking the analogy further, the human keyboard only applies to   
> > public communication. Privately, we have no keys to strike, and   
> > entire paragraphs or books can be represented by a single thought.   
> > Unlike computers, we do not have to build our ideas up from   
> > syntactic digits. 
> > 
> It is the same for computers, once they have developed some relative 
> history. This is well modeled by the "& p" part of the definition of 
> knowing, and the math confirms this. Similarly, no code at all can 
> explain why you feel that you are the one in W, instead of the one in M, 
> in the WM-duplication experience. Computers are not confronted just with 
> symbols, but also with truth. 
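
For reference, I take the "& p" here to be the Theaetetus-style definition 
of knowing that Bruno uses: to know p is to believe p while p is in fact 
true. In modal notation (my gloss, assuming that reading, not Bruno's 
wording here): 

    K p  ≡  B p ∧ p      (knowing p = believing p, and p is true) 
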
> > Instead, the public-facing computation follows from the experienced 
> > sense of what is to be communicated in general, from the top down, 
> > and the inside out. 
> > 
> OK. But that does not distinguish a carbon brain from a silicon machine. 

The silicon machine is built from the bottom up and the outside in. It 
doesn't develop its own agenda; it only mindlessly executes an alien agenda.


> Bruno 
> http://iridia.ulb.ac.be/~marchal/ 
