Stan said:
 Using "how one feels immediately" is not a good strategy for making decisions. 
 Any view/cognition needs to be supported by features that are external, or 
logically evident...

 To bring this around to machine choices: the arbitrator will likely compare 
two "merit" numbers and choose the higher merit.  Merit is highly abstract - 
that is, it doesn't have a characteristic that tells where it came from...  
Lately I've come to see that we don't have to invent "merit" or have a machine 
to calculate it - we simply need to collect it from various sources that we 
trust...

 ...An AGI doesn't have to have mechanisms that "produce" judgments - 
judgments are the fuel aspect of an AGI.  An AGI needs to be a collector and 
user of what's out there.
-----------------------------------------
 
If words had a base meaning, like the elemental components of a constructed 
system, then computers could easily understand sentences, and the meaning of 
systems of sentences and of more complicated ideas could be built on those 
fundamentals.  You have ventured that artificial judgment might be possible 
without merit evaluations, but you did not go far enough to state that sound 
judgment must be based on insight.  And if judgment is not itself a system of 
evaluation, then it cannot be reliably acquired by a system of evaluation 
either.  This may seem like sophistry to some people, but it seems like the 
central issue to me.
 
The problem is that in a relativistic system there is always the possibility 
that a reference has a special meaning in a particular context, and that 
meaning might not be derivable through any reasonably simple evaluation.  
Conceptual relativism would indicate that numerical evaluation will work in 
some cases, and this is what the evidence does show.  However, it is not a 
reliable method for all cases.
 
So the complication here is: how do you build judgment?  Well, it must come 
from a lot of experience, right?  That is the presumed basis of good judgment 
in human affairs.  It corresponds to the idea that it takes a lot of knowledge 
to really know even one simple thing about a subject.  In terms of text, then, 
it takes a lot of statements about a subject to understand a simple statement 
of the subject.  This is only one small step toward understanding and the 
creation of sound artificial judgment, but it is a step you don't want to miss.
 
Jim Bromer

From: Stanley Nilsen <[email protected]>
To: [email protected]
Subject: Re: [agi] Goal Selection
Date: Fri, 17 May 2013 16:32:12 -0600
Just read a little of the first article.  I immediately see that Thagard and I 
differ in a few important ways.  First, Thagard speaks of emotions as an 
initial reaction and then quickly calls this the "gut reaction."  For me, 
emotion is more of a continuum, and the two ends of the spectrum are vastly 
different.  At one end is the "triggered" response, which occurs quickly and 
subconsciously.  There isn't much we can do about this kind of emotion except 
to "condition" oneself to react differently.

 The other end of the spectrum is more interesting and, in my opinion, equally 
valuable.  A good example of this "other" type of emotion is what we experience 
and describe as "I just knew it was the right thing."  In other words, we 
"felt" the rightness of a path or choice.  Personally, I wouldn't call that 
emotion, but others consider any "feeling" to be emotion.  If anything, I would 
call this the "intuition" experience. 

 I argue that the "intuition" end of the spectrum is natural and a desirable 
mental state to arrive at.  Remember the explanations of how Watson came to 
the point of "buzzing in" during the Jeopardy! competition?  Watson was able to 
integrate several aspects of the "problem" of fitting an answer to the clues.  
My understanding is that it used something like experts in various categories. 
 The ratings or rankings of these various components were then "summed" into a 
number, and if the number was "big enough" then Watson buzzed in.  One could 
say "Watson was confident enough..." or that Watson was "hefty" enough to 
weigh in...  It starts to sound like "Watson felt like buzzing." 
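 The buzz decision described above can be sketched in a few lines: category 
"experts" each rate the candidate answer, the ratings are combined into one 
number, and the system "feels like buzzing" only if that number clears a 
threshold.  Everything concrete below - the expert names, weights, scores, and 
threshold - is invented for illustration; it is not Watson's actual mechanism. 

```python
# Sketch of the threshold-based buzz decision: several category
# "experts" each rate a candidate answer, the weighted ratings are
# summed, and the system buzzes only if the total clears a threshold.
# All names, weights, and numbers here are invented for illustration.

def should_buzz(expert_scores, weights, threshold):
    """Return (confidence, buzz?) for one candidate answer."""
    confidence = sum(weights[name] * score
                     for name, score in expert_scores.items())
    return confidence, confidence >= threshold

# Hypothetical experts rating one candidate answer:
scores = {"geography": 0.2, "wordplay": 0.5, "history": 0.3}
weights = {"geography": 1.0, "wordplay": 0.8, "history": 0.6}

confidence, buzz = should_buzz(scores, weights, threshold=0.75)
```

 Note that once the ratings are summed, the single confidence number carries 
no trace of which expert contributed what - which is exactly why the net 
result reads like a "feeling." 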

 I believe this is what "deep" human thought is really about - taking the "sum" 
of various ways of looking at the issue and seeing what that net result feels 
like.  (don't confuse me with facts about how mathematicians and artists think  
:-) )  Notice the "feels" like.  

 How else would one deal with a summary of various unrelated aspects?  We 
recognize that some aspects of a car are more important to us than others, but 
we arrive at the point of decision with a "net" feeling - we think the deal is 
right. I suspect that there is also a "feeling" of not accepting - which leads 
to more search and analysis. 

 Using "how one feels immediately" is not a good strategy for making decisions. 
 Any view/cognition needs to be supported by features that are external, or 
logically evident.  Using gut reaction for evidence gives your ignorance too 
much say.  A quick gut reaction may help steer us toward proper ways of 
looking at a situation, but it should never be used as "evidence" of our 
rightness. 

 To bring this around to machine choices: the arbitrator will likely compare 
two "merit" numbers and choose the higher merit.  Merit is highly abstract - 
that is, it doesn't have a characteristic that tells where it came from.  (Not 
saying we can't investigate the process of asserting merit, but the arbitrator 
wouldn't do that.)  Lately I've come to see that we don't have to invent 
"merit" or have a machine to calculate it - we simply need to collect it from 
various sources that we trust.  (An AGI may aspire to calculate merit one day 
- but that is for the mature, not the novice.) 

 My new saying: "I don't expect the design of a car to specify how to create 
gasoline - the gas is fuel for the car, not the 'car' invention.  Likewise, an 
AGI doesn't have to have mechanisms that 'produce' judgments - judgments are 
the fuel aspect of an AGI.  An AGI needs to be a collector and user of what's 
out there."
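
 The collect-rather-than-compute idea above can be sketched as follows: merit 
arrives from trusted external sources, and the arbitrator only compares the 
totals - it never inspects where a merit number came from.  All source names 
and numbers below are invented for illustration. 

```python
# Sketch of the arbitrator: merit is collected from trusted sources,
# not computed by the machine itself.  By the time the arbitrator
# compares the totals, the information about where each merit value
# came from is gone.  Names and values are invented for illustration.

def collect_merit(option, trusted_sources):
    """Sum the merit assigned to an option by sources we trust."""
    return sum(source(option) for source in trusted_sources)

def arbitrate(option_a, option_b, trusted_sources):
    """Choose whichever option collected the higher merit."""
    merit_a = collect_merit(option_a, trusted_sources)
    merit_b = collect_merit(option_b, trusted_sources)
    return option_a if merit_a >= merit_b else option_b

# Two hypothetical trusted sources rating two choices:
ratings = {
    "take the highway": {"traffic_report": 0.9, "past_experience": 0.4},
    "take back roads":  {"traffic_report": 0.3, "past_experience": 0.7},
}

def trusted_source(name):
    """A 'source we trust' is just a function from option to merit."""
    return lambda option: ratings[option][name]

sources = [trusted_source("traffic_report"),
           trusted_source("past_experience")]

choice = arbitrate("take the highway", "take back roads", sources)
```

 In this framing, judgment enters the system the way gasoline enters a car: 
the sources supply it, and the machine only consumes it. 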


 Thanks for the reference to Thagard - I'll read more, but his low opinion 
of intuition is hard for me to accept. 

 Stan



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
