Re: postings of John Rose and Vladimir Nesov below:

I generally agree with both of your postings.  Grounding is a relative
concept.  There are many different degrees of grounding, some much more
powerful than others. Many expert systems had a degree (a relatively low
one) of grounding in their narrow domains.

And grounding can come in many different forms, from many different types
of experience.

With powerful learning algorithms I think you could, as John has
suggested, obtain a significant amount of grounding from reading extremely
large amounts of text.  The text read would constitute a form of
experience.  There would be a lot of regularities in the aspects of the
world described by many types of text, and many associations, patterns,
situations, and generalizations could be learned from them.  Of course
there would probably be very important gaps in knowledge obtained only in
this way.
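As a toy illustration (my own sketch, not anything proposed in the thread;
the corpus and word lists are invented), one very crude form of
"regularity" is simply which content words repeatedly co-occur across
sentences:

```python
# Toy sketch: count content-word co-occurrences in a tiny invented corpus
# as a crude stand-in for "learning regularities from text".
from collections import Counter
from itertools import combinations

corpus = [
    "the ball fell to the ground",
    "the rock fell to the ground",
    "the ball bounced off the ground",
]

stopwords = {"the", "to", "off"}
pair_counts = Counter()
for sentence in corpus:
    words = set(sentence.split()) - stopwords
    for a, b in combinations(sorted(words), 2):
        pair_counts[(a, b)] += 1

# Pairs seen more than once hint at a regularity in the described world,
# e.g. that things fall to the ground.
regularities = [pair for pair, n in pair_counts.items() if n > 1]
```

A real system would obviously need far richer structure than bare
co-occurrence counts, but even this shows how repeated patterns in text
start to carry information about the world the text describes.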

So it would be good to have more than just learning from text.

Edward W. Porter
Porter & Associates
24 String Bridge S12
Exeter, NH 03833
(617) 494-1722
Fax (617) 494-1822
[EMAIL PROTECTED]



-----Original Message-----
From: John G. Rose [mailto:[EMAIL PROTECTED]
Sent: Thursday, October 11, 2007 7:10 PM
To: [email protected]
Subject: RE: [agi] Do the inference rules.. P.S.


This is how I "envision" it when an AGI is fed its world view as text
only. The more text it processes, the more a spatial physical system would
emerge internally, assuming it is fed text that describes physical things.
Certain relational laws are common across text that describes and reflects
the physical world as we know it. There might be
limits to what the AGI could construct from this information alone but
basic Newtonian physics systems could be constructed. If you fed it more
advanced physics textbooks it should be able to construct Newtonian+
systems - branch out from the basics. Its "handles" to the physical world
would be text based or internally constructed representational entities,
which BTW would be text based, i.e. numerical representations in base 256
or base n, binary in physical memory. Theoretically it could construct
bitmap visual scenes, or estimate what they would look like if it was told
to "show" what visual imagery would look like to someone with eyes. It
could figure out what color is, shading, textures, and ultimately 3D space
with motion - depending on the AGI algorithms programmed into it that
is... But if it were not fed enough text containing physical
interrelationships, its physics and projected bitmaps would be distorted.
There would have to be enough information in the text, or it would have to
be smart enough to derive accurate models from minimal information.

Now naturally it might be better to ground it from the get-go with spatial
physics, but for development and testing purposes, building something that
could figure that out on its own would be an interesting challenge.

John
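To make the "relational laws" idea above concrete, here is a toy sketch
(my own illustration, not John's design; the facts and relation name are
invented) of deriving new spatial facts from textual assertions by closing
the "above" relation under transitivity:

```python
# Toy sketch: spatial facts as (subject, relation, object) triples,
# as might be extracted from sentences like "the roof is above the wall".
facts = {("roof", "above", "wall"), ("wall", "above", "floor")}

def derive(facts):
    """Close the 'above' relation under transitivity:
    if A is above B and B is above C, then A is above C."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for a, _, b in list(derived):
            for c, _, d in list(derived):
                if b == c and (a, "above", d) not in derived:
                    derived.add((a, "above", d))
                    changed = True
    return derived

all_facts = derive(facts)  # now also contains ("roof", "above", "floor")
```

Even this trivial rule yields facts never stated in the input text, which
is the kind of internal construction John is gesturing at, though a real
AGI would need to learn such laws rather than have them hard-coded.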



> From: Vladimir Nesov [mailto:[EMAIL PROTECTED]
> Subject: Re: [agi] Do the inference rules.. P.S.
>
> ...and also why can't a 3D world model be just described abstractly, by
> presenting the intelligent agent with a bunch of objects with attached
> properties and relations between them that preserve certain
> invariants? The spatial part of a world model doesn't seem to be more
> complex than the general problem of knowledge arrangement, when you have
> to keep track of all kinds of properties that should (and shouldn't)
> be derived for a given scene.
>
> On 10/12/07, Mark Waser <[EMAIL PROTECTED]> wrote:
> > >> spatial perception cannot exist without vision.
> >
> > How does someone who is blind from birth have spatial perception
> > then?
> >
> > Vision is one particular sense that can lead to a 3-dimensional model
> > of the world (spatial perception) but there are others (touch &
> > echo-location hearing to name two).
> >
> > Why can't echo-location lead to spatial perception without vision?
> > Why can't touch?
> >

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?&;
