Bob,

I was aware the drum playing shown was rudimentary from a robotics
standpoint, nothing as sophisticated as the running and dancing Japanese
robots.

But I agree the project is really quite ambitious in that it is trying to
create an embodied robot with a real AGI for a brain.  

It may well make major contributions to AGI.  

Perhaps OpenCog should seek some sort of cross-fertilization with the
RobotCub people.

Ed Porter

-----Original Message-----
From: Bob Mottram [mailto:[EMAIL PROTECTED] 
Sent: Monday, April 28, 2008 6:09 PM
To: [email protected]
Subject: Re: [agi] An interesting project on embodied AGI

Incidentally this is also an open source robot.
http://eris.liralab.it/wiki/RobotCubSoftware

Mechanically sophisticated humanoids have a long history.  What's
interesting about these is not how much money is spent or how many
axes are actuated but the sophistication of the software and
particularly the ability of the robot to perceive its world in some
meaningful way.  It's easy to make it apparently play the drums, but
this is really nothing more than Jacques de Vaucanson's automata were
doing in the 18th century (i.e. not much more than
puppetry/animatronics).



2008/4/28 Ed Porter <[EMAIL PROTECTED]>:
>
> Josh,
>
> Thanks for your link.  I read the article.  It is quite interesting.  I
> got some more info on the $9.7 million RobotCub project below, of which
> iCub is the physical embodiment part.
> Ed Porter
>
> ==============================
>
>
>
> An EU-funded project is to attempt to educate a humanoid robot called
> 'iCub' which, at a metre tall, is the same size as a three-year-old
> toddler and is able to crawl, sit up, feel, see and hear. The four-year
> Integration and Transfer of Action and Language Knowledge in Robots
> (ITALK) project will teach the so-called 'toddlerbot' how to develop
> cognitive skills, using processes similar to the ways parents develop
> these skills in their children. Led by the University of Plymouth, the
> consortium received an EU grant of EUR 6.2 million [ewp: equals $9.7
> million] to carry out their project.
>
>
>
> The project is expected to bring cognitive robotics research closer to the
> development of humanoid robots who can think, act and talk like human
> beings. An important skill here would be the development of 'learning' and
> updating available knowledge with the help of stimuli from an external
> environment.
>
>  To begin with, iCub will learn simple activities such as fitting
> objects of different shapes into their correct slots, nesting objects of
> different sizes, and stacking blocks - in short, activities that help
> infants develop certain cognitive skills. The next step would be to
> teach it how to name objects and describe actions such as 'robot puts
> stick on cube'.
>
>  Scientists will also work on developing iCub's language skills in
> association with language researchers, who have studied how parents help
> their children learn to talk. Researchers aim to instill language skills
> in iCub, so that the robot can teach itself how to talk.
>
>  Whatever the robot learns individually and socially should help to
> develop its language skills, which in turn should help iCub interact
> better with its environment and pick up more knowledge to learn more.
> Knowledge of grammar and vocabulary is also likely to emerge naturally
> through this process.
>
>  'Our approach is that the robot will use what it learns individually
> and socially from others to bootstrap the acquisition of language, and
> will use its language abilities in turn to drive its learning of social
> and manipulative abilities,' says Professor Chrystopher Nehaniv from the
> University of Hertfordshire's School of Computer Science, one of the
> project partners. 'This creates a positive feedback cycle between using
> language and developing other cognitive abilities. Like a child learning
> by imitation of its parents and interacting with the environment around
> it, the robot will master basic principles of structured grammar, like
> negation, by using these abilities in context.'
>
>  The scientific and technological research developed during the project
> is expected to have a significant impact on the next generation of
> interactive robotic systems over the coming ten years, and on Europe's
> leadership role in this area. 'iCub will take us a stage forward in
> developing robots as social companions. We have studied issues such as
> how robots should look and how close people will want them to approach,
> and now, within a year, we will have the first humanoid robot capable of
> developing language skills,' says Professor Kerstin Dautenhahn, who is
> also from the University of Hertfordshire's School of Computer Science.
>
>
>
>
>
> The above text comes from
> http://ec.europa.eu/research/headlines/news/article_08_03_20_en.html
>
> Some more links re iCub:
>
> (1) iCub drumming:
> http://www.robotcub.org/index.php/robotcub/content/download/1135/3982/file/icubFullDrumming3.wmv
>
> (2) High-quality picture of iCub:
> http://www.robotcub.org/index.php/robotcub/content/download/1131/3970/file/DSC_3994.jpg
>
> (3) PDF describing the robot in detail and, near its end, giving a brief
> description of some of the software, including the open source software
> they are using:
> http://www.robotcub.org/misc/review3/05_Metta_et_al.pdf
>
>
> -----Original Message-----
>  From: J Storrs Hall, PhD [mailto:[EMAIL PROTECTED]
>  Sent: Monday, April 28, 2008 2:27 PM
>  To: [email protected]
>  Subject: Re: [agi] An interesting project on embodied AGI
>
>
>
> I drool over the physical robot -- it's built like a brick outhouse. It
> has 53 degrees of freedom, binocular vision, touch, audition, and
> inertial sensors, harmonic drives, top-grade aircraft aluminum members,
> the works.
>
>
>
> That doofy face places it squarely in the deepest ravine of the uncanny
> valley.
>
>
>
> They talk a very good game in the AGI approach they're taking, for my
> money: see
> http://www.robotcub.org/misc/review3/07_Vernon_Metta_Sandini_ICDL.pdf
>
>
>
> Josh
>
>
>
> On Monday 28 April 2008 10:13:48 am, Ed Porter wrote:
>
> > For an article on an interesting project on embodied AGI read
> > "Next Step In Robot Development Is Child's Play" at
> > http://www.sciencedaily.com/releases/2008/04/080421162240.htm
>
>
> -------------------------------------------
> agi
> Archives: http://www.listbox.com/member/archive/303/=now
> RSS Feed: http://www.listbox.com/member/archive/rss/303/
> Modify Your Subscription: http://www.listbox.com/member/?&;
> Powered by Listbox: http://www.listbox.com

