No, I keep saying: I'm not asking for the odd narrowly defined task, but 
rather for definitions of CLASSES of specific problems that your/an AGI will 
be able to tackle. Part of the definition task should be to explain how, if 
you can solve one kind of problem, you will then be able to solve other 
distinct kinds.

It's interesting - and I'm not being in any way critical - that this isn't 
getting through.
  ----- Original Message ----- 
  From: Benjamin Goertzel 
  To: agi@v2.listbox.com 
  Sent: Tuesday, May 01, 2007 7:04 PM
  Subject: Re: [agi] The role of uncertainty

    Not much point in arguing further here - all I can say now is TRY it - try 
focusing your work the other way round - I'm confident you'll find it makes 
life vastly easier and more productive.  Defining what it does is just as 
essential for the designer as for the consumer.


  Focusing on making systems that can achieve narrowly defined tasks is EXACTLY 
what the AI field has been doing for the last couple of decades.

  Unsurprisingly, it has had some modest success at making systems that can 
achieve narrowly defined tasks, and no success at moving toward artificial 
general intelligence.

  -- Ben G

------------------------------------------------------------------------------
  This list is sponsored by AGIRI: http://www.agiri.org/email
  To unsubscribe or change your options, please go to:
  http://v2.listbox.com/member/?member_id=231415&user_secret=fabd7936


