On Wed, Apr 24, 2013 at 4:46 AM, Craig Weinberg <whatsons...@gmail.com> wrote:

>
>
> On Wednesday, April 24, 2013 4:31:55 AM UTC-4, Brian Tenneson wrote:
>
>>
>>
>> On Tue, Apr 23, 2013 at 8:53 PM, Craig Weinberg <whats...@gmail.com> wrote:
>>
>>>
>>>
>>> On Tuesday, April 23, 2013 11:37:14 PM UTC-4, Brian Tenneson wrote:
>>>>
>>>> You keep claiming that we understand this and that or know this and
>>>> that.  And, yes, saying something along the lines of "we know we understand
>>>> because we care about what we understand" *is* circular.
>>>
>>>
>>> No, it's not. I'm saying that it is impossible to doubt we understand.
>>> It's just playing with words. My point about caring is that it makes it
>>> clear that we intuitively make a distinction between merely being aware of
>>> something and understanding it.
>>>
>> I'll try to explain how "we know we understand because we care about
>> what we understand" is circular.  Note the use of the word understand
>> towards the left edge of the statement in quotes followed by another
>> instance of the word understand.
>>
>
> You should read it as "we know we understand because we care about X". My
> only intention in repeating the word was to make it clear that the thing
> that we care about is the thing that we understand. It is the caring which
> is a symptom of understanding. The absence of that symptom of caring in a
> machine indicates to me that there is a lack of understanding. Things which
> understand can care, but things that cannot care cannot understand.
>
Now, that isn't circular, but it's a poor sign of understanding.  I care
very much for women, but I can't say that I understand them.  I understand
the rules of English grammar and punctuation but care little for them.  I'm
sure you can think of examples.  So the two, caring and understanding, are
not correlated.  Caring is not something that can really be measured in
humans, while it can be measured in machines/computers.  For example, one
might define caring about something as thinking a lot about it, where "a
lot" means some threshold, say more than 50% of resources dedicated to it
for a while (a nonzero, finite span of time).  These days we can multitask
and look at the resource monitor to see what the CPU cares about, if
anything.  If it doesn't care about anything, it uses close to 0% and is
called idle.  But if I am running an intensive computation while typing
this and look at my resource monitor, I can see measurements indicating
that my CPU cares much more about the intensive computation than about
what I am typing.  Does that mean the CPU understands what it is doing?
No.  Likewise with human brains: we can care a lot about something but
have little to no understanding of it.
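To make that operational definition concrete, here is a minimal sketch of
the resource-monitor idea in Python, using the third-party psutil package:
treat the share of CPU a process receives over a finite sampling window as
a crude stand-in for how much the machine "cares" about it.  The 50%
cutoff and the helper name busiest_processes are illustrative choices for
the arbitrary threshold suggested above, not anything standard, and, as
argued here, none of it implies understanding.

import time
import psutil

CARE_THRESHOLD = 50.0  # percent of one CPU; the arbitrary "over 50% resources" cutoff

def busiest_processes(sample_seconds=1.0, top_n=5):
    # Sample per-process CPU usage over a nonzero, finite span of time.
    procs = list(psutil.process_iter(['pid', 'name']))
    for p in procs:
        try:
            p.cpu_percent(None)  # prime the counter; the first call always returns 0.0
        except psutil.Error:
            pass
    time.sleep(sample_seconds)
    usage = []
    for p in procs:
        try:
            usage.append((p.cpu_percent(None), p.info['name'], p.info['pid']))
        except psutil.Error:
            continue  # the process may have exited during the sample
    return sorted(usage, key=lambda t: t[0], reverse=True)[:top_n]

if __name__ == '__main__':
    if psutil.cpu_percent(interval=1.0) < 1.0:
        print("Overall CPU is nearly idle: by this definition it 'cares' about nothing.")
    for pct, name, pid in busiest_processes():
        verdict = "cares about" if pct > CARE_THRESHOLD else "barely attends to"
        print("CPU %s %s (pid %s): %.1f%%" % (verdict, name, pid, pct))

Run during an intensive computation, the busiest process should dominate
the list, which is exactly the resource-monitor observation above.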


>
>> This is analogous to saying we are Unicorns because we care about Unicorns.
>>
>
> No, this is analogous to you not understanding what I mean and
> unintentionally making a straw man of my argument.
>

Well, be honest here: you changed the phrasing.  You went from
(paraphrasing) "we know we understand because we care that we understand"
to "we know we understand because we care about X".  Correct me if I'm
wrong.  The first phrasing is meaningless because of the second use of the
word understand (so you might as well be talking about unicorns).  The
first phrasing gives no insight into what understanding is and why we have
it but computers can't.  The problem with your new and improved phrasing
is that it's a doctored definition of caring: you pick a definition
related to understanding such that it (the definition of 'caring') will
*automatically* fail for anything other than a non-apathetic human.  In
essence, you are assuming computers don't care about anything when, in
fact, doing what they are programmed to do (much like a human, I might
add) is the machine-equivalent of caring about what they are told to do.


>
> Doesn't prove unicorns exist; doesn't prove understanding exists (i.e.,
>> that any human understands anything). If this is all sophistry then it
>> should be easily dismissible. And yes, playing with words is what people
>> normally do, wittingly or unwittingly, and that lends more evidence to the
>> notion that we are processors in a Chinese room.
>>
>
> The position that we only think we understand or that consciousness is an
> illusion is, in my view, the desperate act of a stubborn mind. Truly, you
> are sawing off the branch that you are sitting on to suggest that we are
> incapable of understanding the very conversation that we are having.
>

Well, calling a conclusion the desperate act of a stubborn mind, rather
than supplying some decent rejoinder, is also the desperate act of a
stubborn mind, wouldn't you say?  While "sawing off the branch you are
sitting on" is a very clever arrangement of letters (can I use it in a
future poem?), it falls short of being an argument, or even persuasive.
We can get along just fine by thinking that we understand this
conversation.  But knowing that we understand this conversation?  I'd like
to see that proved.  Until then, I will continue to think that humans, or
at least the seat of mind, are possibly all in Chinese rooms.


>
>
>>
>>>
>>>> Still doesn't rule out the possibility that we are in a Chinese room
>>>> right now, manipulating symbols without really understanding what's going
>>>> on but able to adeptly shuffle the symbols around fast enough to appear
>>>> functional.
>>>
>>>
>>> Why not? If we were manipulating symbols, why would we care about them?
>>> What you're saying doesn't even make sense. We are having a conversation.
>>> We care about the conversation because we understand it. If I was being
>>> dictated to write in another language instead, I would not care about the
>>> conversation. Are you claiming that there is no difference between having a
>>> conversation in English and dictating text in a language you don't
>>> understand?
>>>
>>
>
>> We care about the symbols because working through the symbols in our
>> brains is what leads to food, shelter, sex, and all the things animals
>> want.
>>
>
> First of all, there are no symbols in our brains, unless you think that
> serotonin or ATP is a symbol. Secondly, the fact that species have needs
> does not imply any sort of caring at all. A car needs fuel and oil but it
> doesn't care about them. When the fuel light comes up on your dashboard,
> that is for you to care about your car, not a sign that the car is anxious.
> Instead of a light on the dashboard, a more intelligently designed car
> could proceed to the filling station and dock at a smart pump, or it could
> use geological measurements and drill out its own petroleum to refine...all
> without the slightest bit of caring or understanding.
>
The electric and chemical footprints of representations of symbols are in
our brains, and caring about those symbols is what leads to food, shelter,
sex, and all the things animals (not cars) want.


>
>> Or we care about the symbols because they further enrich our lives.
>>
>
> That's circular. Why do we care about enriching our lives? Because we care
> about our lives and richness. We don't have to, though, in theory, and a
> machine never can.
>
Some people care about enriching their lives, presumably because it
ultimately makes them more satisfied in life.  How do you know what a
machine can never do?  They used to say a machine would never fly.
Convictions are prisons.

>
>
>> The symbols in this corner of the internet (barring my contributions of
>> course) are examples of that.  Regarding the world, would you say there is
>> more that we (i.e., at least one human) understand or more that we don't?
>> I would vote 'don't' and that leads me also to suspect we are in a Chinese
>> room right now.
>>
>
> I don't know where we are in the extent of our understanding, but there is
> some understanding, while the man in the Chinese room has no understanding.
>
You think there is some understanding because you are really adept at
symbol processing.  Your man in the Chinese room is so convincing that the
symbols transmitted affirm that you do, in fact, understand.

I think this is one of your theorems: at least one human mind has the
capacity to understand something and does understand something.

Or do you assume that when your inner voice tells you that you understand
something, your inner voice is correct?



>
>
>> Your coupling of caring and understanding is somewhat arbitrary.
>>
>
> No, it is supported by the English language:
> http://dictionary.reverso.net/english-synonyms/understanding
>

I meant that in the context of this discussion, the mention of compassion
and caring at best might have something to do with understanding.  I bet
that by the time you reply to this email I could print a list with a
million elements, each of which contains a truth that I think I understand
but care nothing about.


> There is no chance of being in a Chinese room at all, because we
> understand some things.
>
That's one possibility.  Another is that we're dumb but fast.

>
>
> Because the Chinese room prohibits us from ever entertaining the
> possibility that we are in the Chinese Room. Just because we are not
> omniscient and omnipotent does not mean that we are senseless and powerless.
>

Why is that?  Being aware of one's present location, and even of all the
properties of that location, does NOT mean understanding one's present
location and all its properties.  That's a good example because we humans
are in that situation: we can describe how our environment works, but we
do not understand all the properties of our location; we don't even
understand why we are here.
