Well, solving ANY problem is a little too strong.  This is AGI, not AGH
(artificial godhead), though AGH could be an unintended consequence ;).  So
I would rephrase "solving any problem" as being able to come up with
reasonable approaches and strategies to any problem (just as humans are able
to do).

On Mon, Jul 19, 2010 at 11:32 AM, Mike Tintner <tint...@blueyonder.co.uk>wrote:

>  Whaddya mean by "solve the problem of how to solve problems"? Develop a
> universal approach to solving any problem? Or find a method of solving a
> class of problems? Or what?
>
>  *From:* rob levy <r.p.l...@gmail.com>
> *Sent:* Monday, July 19, 2010 1:26 PM
> *To:* agi <agi@v2.listbox.com>
> *Subject:* Re: [agi] Of definitions and tests of AGI
>
>
>> However, I see that there are no valid definitions of AGI that explain
>> what AGI is generally, and why these tests are indeed tests of AGI. Google it - there
>> are very few definitions of AGI or Strong AI, period.
>>
>
>
> I like Fogel's idea that intelligence is the ability to "solve the problem
> of how to solve problems" in new and changing environments.  I don't think
> Fogel's method accomplishes this, but the goal he expresses seems to be the
> goal of AGI as I understand it.
>
> Rob
>



-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=8660244&id_secret=8660244-6e7fb59c
Powered by Listbox: http://www.listbox.com
