On 31 August 2014 13:10, meekerdb <[email protected]> wrote:

>  On 8/30/2014 5:54 PM, LizR wrote:
>
>  On 31 August 2014 12:27, meekerdb <[email protected]> wrote:
>
>>  On 8/30/2014 4:04 PM, LizR wrote:
>>
>>   To be absolutely clear - the "Artificial" in AI refers to the machine
>> which hosts the intelligence, not to the intelligence itself.
>>
>>  The problem with machines defeating "Jeopardy" players (I assume this
>> refers to http://en.wikipedia.org/wiki/Jeopardy_%28TV_series%29 ?) is that
>> the machines concerned almost certainly have no concepts of what the
>> answers were about.
>>
>>
>>  How do you have a concept of what "Who was Charlemagne?" is about?
>> Isn't a lot of it verbal and relational - stuff Winston does know.  Of
>> course Winston is ignorant of a lot of basic things about being a person,
>> because it doesn't have perceptual sensors or the ability to move and
>> manipulate things.
>>
>
>  That's the point. Winston or whatever isn't immersed in an environment,
> or its environment only involves abstract relations. So I do have a better
> idea of who Charlemagne was, even if I'd never heard of him before.
>
>  Sure, you have a better idea.  But I don't think that shows that Winston
> has "no concept of what the answers are about."  His concepts are limited
> to verbal relations, but he probably has more of those related to
> Charlemagne than I do.
>

So you appear to think purely abstract relations can be "about something"
even when they have no relation to experience of an environment - is that
correct?

>> Hence they aren't in fact "doing what humans do" (or at least not what
>> most humans do, apart perhaps from *idiots savants*). Likewise, Deep Junior
>> almost certainly has no concept of what it's doing when it scores a 3-3 tie
>> against Kasparov. It has no concept of itself or its opponent - or at most
>> very limited "concepts" embedded in relatively small data structures - and
>> it experiences no emotions on winning or losing.

>  Isn't the reason you think that simply that its input/output is so
> limited?  It wouldn't be at all difficult to add to Deep Blue's program so
> that on winning it composed a poem of celebration and displayed fireworks
> on a screen - or even set off real fireworks - and on losing it shut down
> and refused to do anything for three days.
>

No, I think that because there's no evidence whatsoever that Deep Blue etc.
have feelings - at least none that I've come across. I'd be happy to be
proved wrong (which would be a boost for comp, I suppose).
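After all, the kind of display you describe could be bolted together in a
dozen lines - here's a purely hypothetical sketch (the names are invented
for illustration, and have nothing to do with Deep Blue's actual code):

    import time

    def show_fireworks():
        # Stand-in for a graphical display: just print some ASCII sparks.
        print("  *  .  *   .   *")
        print(" .   *    *  .  *")

    def react_to_result(result):
        """Produce a canned 'emotional' display for a given game result."""
        if result == "win":
            print("O frabjous day! Victory is mine!")  # canned 'poem'
            show_fireworks()
        elif result == "loss":
            # "Sulk": ignore the world for three days.
            time.sleep(3 * 24 * 60 * 60)

    react_to_result("win")

Nothing in a script like that requires, or is evidence of, any inner
experience - which is rather the point.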

>  I'm asking what would constitute evidence for Deep Blue's having
> feelings?  Fireworks and sulking aren't enough?
>

An ongoing exhibition that it did have them, sustained over a period of time
and accompanied by what appeared to be the results of mentation, etc. - i.e.
passing the equivalent of a Turing test. Plus supporting evidence that it was
conscious, and reasonable theoretical grounds to think that it was (e.g. it
had had an "electronic childhood" like HAL, etc.). Just displaying a smiley
face on a screen by loading in a bitmap wouldn't do it, for me at least.
Given that this would be one of the most profound discoveries (or inventions)
of all time, I'd want some pretty good evidence. Wouldn't you?
