I think PM is alluding to, or otherwise working with, something derived from
the situation calculus. Looking at the Wikipedia entry and recalling some
other similar formalisms, you can see just how difficult it would be to get
an AGI program to understand different situations. Because the initial
understanding is essentially as difficult as producing a useful or
insightful response, the situation calculus was not even half a solution to
AGI-level knowledge. The glass, as it turns out, wasn't even a quarter
full.
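For anyone who hasn't looked at the formalism: here is a minimal, illustrative sketch of the situation-calculus idea in Python. The toy "light switch" domain and all the names in it are my own invention, not anything PM proposed; the point is only that situations are histories of actions and fluents are evaluated against those histories.

```python
# Sketch of the situation-calculus idea (toy domain, my own naming).
# A situation is the history of actions performed since the initial
# situation S0; a fluent is a property evaluated in a situation.

S0 = ()  # the initial (empty) situation

def do(action, situation):
    """Return the successor situation resulting from performing an action."""
    return situation + (action,)

def light_on(situation):
    """Fluent: the light is on iff it has been toggled an odd number of times."""
    return situation.count("toggle") % 2 == 1

s1 = do("toggle", S0)
s2 = do("toggle", s1)
print(light_on(S0), light_on(s1), light_on(s2))  # False True False
```

Even in this trivial domain, note how much is hand-coded: the hard part for AGI is acquiring the fluents and their update rules, not evaluating them.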

Most of the time, when we generate insightful thoughts about a problem, we
are drawing on knowledge of many different things, and much of this
knowledge is beneficial even when it does not solve the main problem we
wish we could solve. Drawing on experience, or on the memory of experience,
is something that many people think requires highly sophisticated
sensorimotor interaction with the world. I disagree, and my disagreement
leads to implications I have to wonder about. I think that sophisticated
knowledge can be encoded in text. On this view, in order to answer
questions about a situation (or to otherwise derive insight into it), an
AGI program would have to derive that information from knowledge acquired
through textual interaction with the world.

Much of this information would be composed of smaller insights that had
been derived previously. Some of these previously acquired insights might
be refined and expressed as generalizations, so that combinations of
simpler insights could be *generated*, in the computational-theory sense of
the term, not just mushed together individually and refined.
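By "generated in the computational-theory sense" I mean the way a grammar generates a language. A quick illustration, with made-up symbols of my own choosing: a handful of stored generalization "templates" generates combinations that were never stored individually.

```python
# A toy context-free grammar: six stored phrases generate eight
# combinations. The nonterminal names are illustrative only.
import itertools

grammar = {
    "INSIGHT": [["AGENT", "RELATION", "OBJECT"]],
    "AGENT": [["the program"], ["a person"]],
    "RELATION": [["understands"], ["summarizes"]],
    "OBJECT": [["a situation"], ["a text"]],
}

def generate(symbol):
    """Yield every string the grammar derives from `symbol`."""
    if symbol not in grammar:
        yield symbol  # terminal: the symbol is itself a phrase
        return
    for production in grammar[symbol]:
        for parts in itertools.product(*(generate(s) for s in production)):
            yield " ".join(parts)

sentences = list(generate("INSIGHT"))
print(len(sentences))  # 8 combinations from 6 stored phrases
```

The stored material grows additively while the generated language grows multiplicatively, which is the property I am after.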

However, this leads to certain questions related to some of this group's
predilections. Since generalization is a kind of compression, am I only
talking about distributed compressions? Well, since generalizations could
be combined *by form* and *by role*, I am talking about a special kind of
compression in which the output can be generated without first
decompressing the individual components. Or, more precisely, a special kind
of generalization in which the output of a combination of generalized
components does not need to be fully decompressed before it can be used.
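To make "combined without first decompressing" concrete, here is a deliberately simple sketch using run-length encoding as a stand-in for the kind of compression I mean (RLE is my choice for illustration, not a proposal): two compressed components are combined, and a query is answered, without ever reconstructing the full strings.

```python
# Combining two run-length-encoded components and querying the result
# without decompressing either one.

def rle(text):
    """Compress a string into a list of (char, run_length) pairs."""
    runs = []
    for ch in text:
        if runs and runs[-1][0] == ch:
            runs[-1] = (ch, runs[-1][1] + 1)
        else:
            runs.append((ch, 1))
    return runs

def combine(a, b):
    """Concatenate two RLE encodings directly, merging the boundary run."""
    if a and b and a[-1][0] == b[0][0]:
        return a[:-1] + [(a[-1][0], a[-1][1] + b[0][1])] + b[1:]
    return a + b

def char_at(runs, i):
    """Answer 'what is the i-th character?' from the compressed form."""
    for ch, n in runs:
        if i < n:
            return ch
        i -= n
    raise IndexError(i)

combined = combine(rle("aaab"), rle("bbcc"))
print(combined)              # [('a', 3), ('b', 3), ('c', 2)]
print(char_at(combined, 4))  # b
```

Combining generalizations by form and by role would of course be far richer than concatenating runs, but the structural point is the same: the combination and the query both operate on the compressed representations themselves.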

The potential of this method, which draws on older AI theories and relates
directly to the potential of distributed compression methods, seems
obvious. But that does not mean it is easy to figure out how to get a
computer program to do something like this.

Jim Bromer


On Thu, May 1, 2014 at 4:15 PM, Mike Archbold via AGI <[email protected]> wrote:

> On 5/1/14, Piaget Modeler via AGI <[email protected]> wrote:
> > Okay,
> > Now that we have a fuzzy definition of situations, what do the words
> > "situation induction" mean to you?
> > Please advise.
> > ~PM
> >
> >
>
> Did you acquaint yourself with "situation calculus"?  I think Ben
> alluded to this.
> Mike
>
> > -------------------------------------------
> > AGI
> > Archives: https://www.listbox.com/member/archive/303/=now
> > RSS Feed:
> https://www.listbox.com/member/archive/rss/303/11943661-d9279dae
> > Modify Your Subscription:
> > https://www.listbox.com/member/?&;
> > Powered by Listbox: http://www.listbox.com
> >
>
>
>


