On Tuesday, April 23, 2013 11:37:14 PM UTC-4, Brian Tenneson wrote:
>
> You keep claiming that we understand this and that or know this and that.  
> And, yes, saying something along the lines of "we know we understand 
> because we care about what we understand" *is* circular.  


No, it's not. I'm saying that it is impossible to doubt that we understand; 
doubting it is just playing with words. My point about caring is that it makes 
clear that we intuitively distinguish between merely being aware of something 
and understanding it.
 

> Still doesn't rule out the possibility that we are in a Chinese room right 
> now, manipulating symbols without really understanding what's going on but 
> able to adeptly shuffle the symbols around fast enough to appear 
> functional. 


Why not? If we were merely manipulating symbols, why would we care about them? 
What you're saying doesn't even make sense. We are having a conversation, and 
we care about the conversation because we understand it. If I were instead 
taking dictation in a language I don't know, I would not care about the 
conversation. Are you claiming that there is no difference between having a 
conversation in English and transcribing text in a language you don't 
understand?
 

> If that is the case, AI might be able to replicate human behavior if human 
> behavior is all computation-based.
>

Yes and no. Human behavior can never be generic; the more generic it is, the 
more inhuman it is. AI could imitate a particular person's behavior and fool 
X% of a given audience, but because human behavior is ultimately driven by 
proprietary preferences, there will probably always be some ratio of audience 
size to duration of exposure beyond which the simulation is positively 
detected. The threshold may be much lower than it seems. Judging from existing 
simulations, it may not always be possible to determine absolutely that 
something is a simulation, but I would be willing to bet that some part of the 
brain lights up differently when presented with a simulated presentation 
versus a genuine one.

Craig
 

>
> On Tue, Apr 23, 2013 at 8:25 PM, Craig Weinberg <whats...@gmail.com> wrote:
>
>>
>>
>> On Tuesday, April 23, 2013 7:59:26 PM UTC-4, Brian Tenneson wrote:
>>
>>>
>>>
>>> On Tue, Apr 23, 2013 at 3:13 PM, Craig Weinberg <whats...@gmail.com> wrote:
>>>
>>>>
>>>>
>>>> On Tuesday, April 23, 2013 4:31:05 PM UTC-4, Brian Tenneson wrote:
>>>>
>>>>>
>>>>>
>>>>> On Tue, Apr 23, 2013 at 1:26 PM, Craig Weinberg <whats...@gmail.com> wrote:
>>>>>
>>>>>>
>>>>>>
>>>>>> Searle wasn't wrong. The whole point of the Chinese Room is to point 
>>>>>> out that computation is a disconnected, anesthetic function which is 
>>>>>> accomplished with no need for understanding of larger contexts. 
>>>>>>
>>>>>>
>>>>>
>>>>> How do we know that what humans do is understand things rather than 
>>>>> just compute things? 
>>>>>
>>>>  
>>>>
>>>> Because we care about what we understand, and we identify with it 
>>>> personally.  Understanding is used also to mean compassion. When someone 
>>>> demonstrates a lack of human understanding, we say that they are behaving 
>>>> robotically, like a machine, etc. Questions like, "How do you know you are 
>>>> conscious?", or "How do you know that you feel?" are sophistry. How do you 
>>>> know that you can ask that question?
>>>>
>>>>
>>> Sounds circular. "we do understand things because we care about what we 
>>> understand."  The type of understanding I was referring to was not about 
>>> compassion.  Why is it so strange to think that we are stuck in a big 
>>> Chinese room, without really understanding anything but being adept at 
>>> pushing symbols around? 
>>>
>>
>> It's not circular; I was trying to be clear about the difference between 
>> computation and understanding. Computation is variations on the theme of 
>> counting, but counting does not help us understand. A dog might be able to 
>> count how many times we speak a command, and we can train it to respond 
>> to the third instance we speak it, but we can use any command to associate 
>> with the action of sitting or begging. We are not in a Chinese room because 
>> we know what kinds of things the word 'sit' actually might refer to. We 
>> know what kind of context it relates to, and we understand what our options 
>> for interpretation and participation are. The dog has no options. It can 
>> follow the conditioned response and get the reward, or it can fail to do 
>> that. It doesn't know what else to do. 
>>
>> Craig
>>
>
>
