On Wednesday, April 3, 2013 8:58:37 PM UTC-4, Jason wrote:
> On Wed, Apr 3, 2013 at 6:04 PM, Craig Weinberg 
> <whats...@gmail.com> wrote:
>
>
>>
>>
>> On Wednesday, April 3, 2013 5:44:24 PM UTC-4, Jason wrote:
>>>
>>>
>>>
>>>
>>> On Sat, Mar 30, 2013 at 7:58 AM, Telmo Menezes 
>>> <te...@telmomenezes.com> wrote:
>>>
>>>
>>>>
>>>>
>>>> On Thu, Mar 28, 2013 at 1:23 PM, Craig Weinberg <whats...@gmail.com> wrote:
>>>>
>>>>>
>>>>>
>>>>> Then shouldn't a powerful computer be able to quickly deduce the 
>>>>> winning Arimaa mappings?
>>>>>
>>>>
>>>> You're making the same mistake as John Clark: confusing the physical 
>>>> computer with the algorithm. Powerful computers don't help us if we 
>>>> don't have the right algorithm. The central mystery of AI, in my 
>>>> opinion, is why on earth we haven't found a general learning algorithm 
>>>> yet. Either it's too complex for our monkey brains, or you're right 
>>>> that computation is not the whole story. I believe it's the former, 
>>>> but I'm not sure, of course. Notice that I'm talking about generic 
>>>> intelligence, not consciousness, which I strongly believe are two 
>>>> distinct phenomena.
>>>>   
>>>>
>>>
>>> Another point toward Telmo's suspicion that learning is complex:
>>>
>>> If learning and thinking intelligently at a human level were 
>>> computationally easy, biology wouldn't have evolved to use trillions of 
>>> synapses. The brain is metabolically very expensive, consuming 20-25% 
>>> of the body's total energy budget of roughly 100 Watts. If so many 
>>> neurons were not needed to do what we do, natural selection would have 
>>> favored those humans with fewer neurons and reduced food requirements.
>>>
>>
>> There's no question that human intelligence improves survival through 
>> learning, and that this is what makes the physiological investment pay 
>> off.
>>
>
> Right, so my point is that we should not expect things like human 
> intelligence or human learning to be trivial or easy to get in robots, 
> when the human brain is the most complex thing we know and can perform 
> more computations than even the largest supercomputers of today.
>

Absolutely, but neither should we expect that complexity alone can turn an 
assembly of inorganic parts into something with a subjective experience 
comparable to that of an animal. 


>  
>
>> What I question is why that improvement would entail awareness.
>>
>
> A human has to be aware to do the things it does, because zombies are not 
> possible.
>

That's begging the question. Anything that is not exactly what we might 
assume it is would be a 'zombie' to some extent. A human does not have to 
be aware to do the things that it does, which is shown by blindsight, 
sleepwalking, brainwashing, etc. A human may, in reality, have to be aware 
to perform all of the functions that we do, but if comp were true, that 
would not be the case.
 

>   Your examples of blindsight are not a disproof of the separability of 
> function and awareness,
>

I understand why you think that, but ultimately it is proof of exactly that.
 

> only examples of broken links in communication (quite similar to split 
> brain patients).
>

A broken link in communication which prevents you from being aware of the 
experience that is informing you is the same thing as function being 
separate from awareness. The end result is that it is not necessary to 
experience any conscious qualia in order to receive optical information. 
There is no functional difference between a "broken link in communication" 
and "separability of function and awareness": the awareness is broken in 
the dead link, but the function is retained, so the two are in fact 
separable.

 
>
>> There are a lot of neurons in our gut as well, and assimilation of 
>> nutrients is undoubtedly complex and important to survival, yet we are not 
>> compelled to insist that there must be some conscious experience to manage 
>> that intelligence. Learning is complex, but awareness itself is simple.
>>
>
> I think the nerves in the gut can manifest as awareness, such as cravings 
> for certain foods when the body realizes it is deficient in some 
> particular nutrient. After all, what is the point of all those nerves if 
> they have no impact on behavior?
>

Oh I agree, because my view is panexperiential. The gut doesn't have the 
kind of awareness that a human being has as a whole, because the other 
organs of the body are not as significant as the brain is to the organism. 
If we are going by the comp assumption though, then there is an implication 
that nothing has any awareness unless it is running a very sophisticated 
program.


Craig
 

>
> Jason

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to everything-list+unsubscr...@googlegroups.com.
To post to this group, send email to everything-list@googlegroups.com.
Visit this group at http://groups.google.com/group/everything-list?hl=en.
For more options, visit https://groups.google.com/groups/opt_out.

