On Wed, Apr 3, 2013 at 9:54 PM, Craig Weinberg <whatsons...@gmail.com> wrote:

>
>
> On Wednesday, April 3, 2013 8:58:37 PM UTC-4, Jason wrote:
>
>>
>>
>>
>> On Wed, Apr 3, 2013 at 6:04 PM, Craig Weinberg <whats...@gmail.com> wrote:
>>
>>>
>>>
>>> On Wednesday, April 3, 2013 5:44:24 PM UTC-4, Jason wrote:
>>>>
>>>>
>>>>
>>>>
>>>> On Sat, Mar 30, 2013 at 7:58 AM, Telmo Menezes
>>>> <te...@telmomenezes.com> wrote:
>>>>
>>>>
>>>>>
>>>>>
>>>>> On Thu, Mar 28, 2013 at 1:23 PM, Craig Weinberg <whats...@gmail.com> wrote:
>>>>>
>>>>>>
>>>>>>
>>>>>> Then shouldn't a powerful computer be able to quickly deduce the
>>>>>> winning Arimaa mappings?
>>>>>>
>>>>>
>>>>> You're making the same mistake as John Clark, confusing the physical
>>>>> computer with the algorithm. Powerful computers don't help us if we don't
>>>>> have the right algorithm. The central mystery of AI, in my opinion, is why
>>>>> on earth haven't we found a general learning algorithm yet. Either it's 
>>>>> too
>>>>> complex for our monkey brains, or you're right that computation is not the
>>>>> whole story. I believe in the former, but I'm not sure, of course.
>>>>> Notice that I'm talking about generic intelligence, not consciousness,
>>>>> which I strongly believe to be two distinct phenomena.
>>>>>
>>>>>
>>>>
>>>> Another point toward Telmo's suspicion that learning is complex:
>>>>
>>>> If learning and thinking intelligently at a human level were
>>>> computationally easy, biology wouldn't have evolved to use trillions of
>>>> synapses.  The brain is metabolically very expensive, consuming 20-25% of
>>>> the body's total energy (roughly 20 of the body's ~100 watts).  If so many neurons were not needed
>>>> to do what we do, natural selection would have selected those humans with
>>>> fewer neurons and reduced food requirements.
>>>>
>>>
>>> There's no question that human intelligence reflects an improved
>>> survival through learning, and that that is what makes the physiological
>>> investment pay off.
>>>
>>
>> Right, so my point is that we should not expect things like human
>> intelligence or human learning to be trivial or easy to get in robots, when
>> the human brain is the most complex thing we know, and can perform more
>> computations than even the largest supercomputers of today.
>>
>
> Absolutely, but neither should we expect that complexity alone
>

I don't think anyone has argued that complexity alone is sufficient.



> can make an assembly of inorganic parts into a subjective experience which
> compares to that of an animal.
>

Both are made of the same four fundamental forces interacting with each
other, why should the number of protons in the nucleus of some atoms in
those organic molecules make any difference to the subject?  What led you
to choose the chemical elements as the origin of sense and feeling, as
opposed to higher level structures (neurology, circuits, etc.) or lower
level structures (quarks, gluons, electrons)?



>
>
>>
>>
>>> What I question is why that improvement would entail awareness.
>>>
>>
>> A human has to be aware to do the things it does, because zombies are not
>> possible.
>>
>
> That's begging the question.
>

Not quite; I provided an argument for my reasoning.  What is your
objection, that zombies are possible, or that zombies are not possible but
that doesn't mean something that in all ways appears conscious must be
conscious?



> Anything that is not exactly what we might assume it is would be a
> 'zombie' to some extent. A human does not have to be aware to do the things
> that it does, which is proved by blindsight, sleepwalking, brainwashing,
> etc. A human may, in reality, have to be aware to perform all of the
> functions that we do, but if comp were true, that would not be the case.
>
>
>>   Your examples of blindsight are not a disproof of the separability of
>> function and awareness,
>>
>
> I understand why you think that, but ultimately it is proof of exactly
> that.
>
>
>> only examples of broken links in communication (quite similar to split
>> brain patients).
>>
>
> A broken link in communication which prevents you from being aware of the
> experience which is informing you is the same thing as function being
> separate from awareness. The end result is that it is not necessary to
> experience any conscious qualia to receive optical information. There is no
> difference functionally between a "broken link in communication" and
> "separability of function and awareness". The awareness is broken in the
> dead link, but the function is retained, thus they are in fact separate.
>

So you take the split brain patient's word for it that he didn't see the
word "PAN" flashed on the screen?
http://www.youtube.com/watch?v=ZMLzP1VCANo&t=1m50s

Perhaps his left hemisphere didn't see it, but his right hemisphere
certainly did, as his right hemisphere is able to draw a picture of that
pan (something in his brain saw it).

I can't experience life through your eyes right now because our brains are
disconnected. Should you take my word for my assertion that you must not be
experiencing anything, because the I in Jason's skull doesn't experience any
visual stimulus from Craig's eyes?



>
>
>>
>>> There are a lot of neurons in our gut as well, and assimilation of
>>> nutrients is undoubtedly complex and important to survival, yet we are not
>>> compelled to insist that there must be some conscious experience to manage
>>> that intelligence. Learning is complex, but awareness itself is simple.
>>>
>>
>> I think the nerves in the gut can manifest as awareness, such as cravings
>> for certain foods when the body realizes it is deficient in some particular
>> nutrient.  After all, what is the point of all those nerves if they have no
>> impact on behavior?
>>
>
> Oh I agree, because my view is panexperiential. The gut doesn't have the
> kind of awareness that a human being has as a whole, because the other
> organs of the body are not as significant as the brain is to the organism.
> If we are going by the comp assumption though, then there is an implication
> that nothing has any awareness unless it is running a very sophisticated
> program.
>
>
I don't think that belief is universal.  Bruno thinks even simple Turing
machines are conscious, and many people debate whether thermostats can be
considered at some level conscious.  I am partial to the idea that
awareness of even a single bit of information represents an atom of
awareness.  But I, unlike you, think that complex sensations require
complex representations.  I think we agree there is no set complexity
threshold where the magic of consciousness begins.

Jason

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.