On Thu, Feb 7, 2013 at 7:12 PM, meekerdb <[email protected]> wrote:

>  On 2/7/2013 3:52 AM, Telmo Menezes wrote:
>
>  On Wed, Feb 6, 2013 at 7:04 PM, John Clark <[email protected]> wrote:
>
>>
>>
>> On Tue, Feb 5, 2013 at 6:00 PM, Telmo Menezes <[email protected]> wrote:
>>
>>>>> I'm not claiming that intelligence == mind.
>>>>
>>>> Do you believe that your fellow human beings have minds? If so why?
>>>
>>> Yes (weakly).
>>
>> You believe that only weakly?! Do you really think there is a 49% chance
>> that you are the only conscious being in the universe?
>>
>
>  I don't know how to assign a probability to that. I guess I believe it's
> in ]0.5, 1] because I would bet on it, but that's all I can say.
>
>  I say weakly because the only thing I have to back this belief is a
> heuristic, which I find to be a weaker way of approximating the truth than
> mathematical proof or experimental confirmation.
>
>
>>  By the way, I don't believe other people have minds when they are
>> sleeping or under anesthesia or dead because when they are in those states
>> they don't behave very intelligently.
>>
>
>  But that is because you believe that intelligence == mind. I don't.
> There are certain experiences you can have that might make you doubt that
> belief, but I don't know of any way to convince you except to suggest that
> you have those experiences yourself.
>
>
>>
>>> Occam's razor. If I'm the only human being with a mind, then, for
>>> some mysterious reason, there are two types of human beings: me (with a
>>> mind) and the others (zombies). So heuristically I'm inclined to believe
>>> that all human beings have a mind,
>>>
>>
>> OK, but if you also believe in Darwin's theory of Evolution then you must
>> also believe that consciousness MUST be a byproduct of intelligence because
>> Evolution can't directly see consciousness any better than we can and so
>> cannot select for it, and yet you and probably other people are conscious.
>> Thus you must also believe that if a computer is intelligent then it is
>> conscious. Then you must also believe that intelligence == mind.
>>
>
>  You are begging the question. You're assuming, to begin with, that
> intelligence == mind and then you claim to prove that intelligence == mind.
>
>  By the way, for evolution to generate consciousness there has to exist a
> gradient to climb, unless the evolutionary process just stumbled into
> consciousness, in which case evolution is not a valid theory of its origin.
> So you are implicitly assuming that there is some measure of consciousness,
> where you can say that entity A is more conscious than entity B. What would
> that even mean? My cat seems conscious to me (but I can't know for sure).
> Is he less conscious than me? Well, I know stuff that he doesn't, but he
> also knows stuff that I don't -- for example, he knows how it feels to be a
> cat.
>
>
> But that doesn't mean there's something magical about being a cat.  I think
> it might be possible to change your brain, and your sensory organs, so that
> they implemented a consciousness very similar to a cat's (it couldn't be
> exact because you'd need a cat's body for that).  Of course it wouldn't be
> Telmo Menezes any more.
>

I agree that this might be possible. But then the paradox is the following:
to make me feel like a cat you have to strip me of my memories (read/write
access), so when I'm back from the experience I won't remember it. In
effect, I turned into a cat for a while and then back into Telmo Menezes,
and Telmo Menezes still knows nothing about being a cat.


>
> And yes, I think there are degrees and kinds of consciousness, and that a
> cat's consciousness differs in both respects.  There's consciousness of
> being an individual and of being located in 3-space and in time.  You and
> the cat have both of those (whereas a Mars rover only has the latter).  But
> there's language and narrative memory that you have and the cat doesn't.
> There's reflective thought: "I'm Telmo and I'm thinking about myself and
> where I fit in the world".  The cat probably doesn't have this because it's
> not social, but a dog might.
>

But is this really a case of "degrees of consciousness" or is it just the
general property of "being conscious" instantiated in different contexts?
The fact that you believe you can turn me into a cat seems to indicate that
ultimately you believe that consciousness is all the same.


>
> Brent
>
