________________________________
 From: John Clark <johnkcl...@gmail.com>
To: everything-list@googlegroups.com 
Sent: Tuesday, August 27, 2013 10:08 AM
Subject: Re: When will a computer pass the Turing Test?
  


On Mon, Aug 26, 2013  Chris de Morsella <cdemorse...@yahoo.com> wrote:


 
> you cannot prove that things in the brain happen either because of some 
> proximate, definable, and identifiable cause or, failing that, as the result 
> of a completely random process. 

>>Bullshit. Axioms don't need proof, and the most fundamental axiom in all of 
>>logic is that X is Y or X is not Y.  Everything else is built on top of that. 
>> And only somebody who was absolutely desperate to prove the innate 
>>superiority of humans over computers would try to deny it.

You seem confused... the brain is not an axiom; it is one of the most complex 
systems we know of in the observed universe.  

> In a system as layered, massively parallel, and highly noisy as the brain, 
> your assumptions about how it works are naïve and border on the comical. The 
> brain is not based on a simple deterministic algorithm in which the chain of 
> cause and effect is always clear. 

>> Although reductionism has recently received a lot of bad press from 
>> supermarket tabloids and new age gurus the fact remains that if you want to 
>> study something complex you've got to break it into simpler parts and then 
>> see how the parts fit together. And in the final analysis things happen for 
>> a reason or they don't happen for a reason; and if they did then it's 
>> deterministic and if they didn't then it's random.   

Perhaps your final analysis is a bit too shallow and self-limiting. Why you 
cling so tenaciously to this need for definitive causality chains (or else it 
must be complete randomness) is amusing, but also misguided. You cannot show 
definitive causality for most of what goes on in most of the universe. You can 
perhaps hypothesize a causal relationship, but you cannot prove one for all 
manner of phenomena arising out of chaotic systems. The brain is a noisy, 
chaotic system, and you are attempting to impose your Newtonian order on it. 

Your approach does not map well onto the problem domain. And what you say has 
no predictive value; it does not help unravel how the brain works... or how the 
mind arises within it.

 
> You can copy the symbols on a sheet of paper, but without understanding 
> Hungarian you will never be impacted by the meaning or sensations that the 
> poem is seeking to convey. 
>

>>True but irrelevant. I never claimed we would someday understand how to make 
>>an AI more intelligent than ourselves, I only said that someday such an AI 
>>would get made.
 
And how can you be sure it has not already been achieved? To go by some of the 
recent DARPA solicitations, they are hot on the trail of trying to 
develop or discover smart algorithms modeled on the neocortex's own algorithms -- 
especially in the area of pattern matching.
 
What I said about needing to understand that which you are studying in order to 
really be able to manipulate, extend, emulate, or simulate it is not only true 
-- as you admit -- but is also relevant. With no understanding of the symbol 
stream, you have no knowledge of what to do with the symbols passing across 
your view; you cannot operate on them in any meaningful manner. It is like 
watching DNA sequences flash by... ACTG... with no insight into what the 
symbols mean, do, or control. 
 
As I agreed earlier -- black box testing has its place, and it is possible to 
discover some aspects of a system through its external interface, but to really 
know a system and be able to describe it, one must open it up and study it 
directly. A white box methodology is required. 
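The black-box / white-box distinction above is standard in software testing. As 
a minimal sketch (using a hypothetical Counter class invented for illustration, 
not anything from this discussion), the difference looks like this in Python:

```python
# Hypothetical system under test -- a toy counter.
class Counter:
    def __init__(self):
        self._ticks = 0          # internal state, invisible to a black-box tester

    def increment(self):
        self._ticks += 1

    def value(self):
        return self._ticks


c = Counter()
c.increment()
c.increment()

# Black-box test: only the external interface is exercised;
# we observe behavior without knowing how it is implemented.
assert c.value() == 2

# White-box test: we "open the box" and inspect internal state directly,
# which requires knowledge of the implementation.
assert c._ticks == 2
```

The black-box assertion would survive any internal rewrite that preserves the 
interface; the white-box assertion would not -- which is exactly why opening the 
system up tells you more about how it actually works.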
 
This applies to understanding the brain as well... it is and will remain a 
mystery until we go in and figure out its fine-grained workings. 

-Chris

  >> John K Clark

 

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to everything-list+unsubscr...@googlegroups.com.
To post to this group, send email to everything-list@googlegroups.com.
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/groups/opt_out.
