No, I am not thinking about specific goals, so you are wrong.  I believe
that every concept is relatively general; perhaps this is where your
misunderstanding is coming from.

On Sat, Feb 9, 2013 at 5:47 PM, Mike Tintner <[email protected]> wrote:

>   It’s suicidal – in the real world. You won’t be able to adapt to new
> situations (“sorry, Chinese is off tonight”).
>
> You’re thinking in terms of specific goals. AGI is about general goals –
> all living creatures are driven in the first instance by general drives and
> general goals – not specific ones.
>
> Narrow AI is always specific – and your & every other AGI-er’s urges are
> always to be specific – whereas AGI is the opposite.
>
>  *From:* Jim Bromer <[email protected]>
> *Sent:* Saturday, February 09, 2013 10:36 PM
> *To:* AGI <[email protected]>
> *Subject:* Re: [agi] Could Algorithm Generators be a Feasible and
> Effective AGI Method?
>
>  It is not suicidal to set goals, and a remark like that shows that you
> are on the wrong track.  Just because the goals that we are most interested
> in may be elusive does not mean that we cannot use goal strategies to
> help us learn what is achievable and acceptable.  The irony, of course, is
> that once we discover more realizable goals we usually find that our
> previous goals were not only unrealizable but really not that
> desirable.
>
> Pursuing unclear goals is not the path you have to choose.  Evidently, you
> have to decide whether that is a path you would like to take.
>
>
>
> On Fri, Feb 8, 2013 at 10:35 AM, Mike Tintner <[email protected]> wrote:
>
>>   *Jim: So if a goal is clearly definable (for an AGI program)...*
>> *No goals are clearly definable in AGI – at the program level, as opposed
>> to in particular situations. What are the goals of real-world activities –
>> like conversation, or reading [science/lit..] books? You can set goals in
>> particular situations ... “I want Chinese tonight.”  But in general, it
>> would be suicidal and isn’t possible.  AGI is about creative machines
>> dealing with a creative world – in which both the machine and the world
>> keep changing, and there is no way of predicting what food (or other goal
>> instantiations) will be available, or which of the vast untried number of
>> cuisines might be worth sampling, or what condition/state your body will be
>> in and therefore what foods it will be attracted to.  The goals of an AGI
>> have to be general, vague, and capable of continuous refinement – and of new
>> combinations with other goals.*
>>
>
>



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
