So, for example, I could apply rules of thumb to a situation without fully
discerning its details. If someone becomes annoyed with my remarks, I don't
need to know exactly why he became annoyed before I can start generating
possible explanations. If the annoyed person gives some reason why he
thinks my remarks are way off, I can then derive slightly more
sophisticated explanations of what has annoyed him. However, what I am
saying is that these more sophisticated explanations are not based on full
descriptions of what is motivating the complaints; they are simply further
rules of thumb based on this additional knowledge. The
rules of thumb could be generated by combinations of generalizations. For
instance: he is annoyed because he doesn't understand what I am saying, or
he is annoyed because my theories make so much sense that it seems as if I
am only stating the obvious. Or, at a slightly more sophisticated level of
analysis, he disagrees with me because he thinks that computer-driven
generalizations are not powerful enough to act in the way I am suggesting
they could. These rules of thumb may seem
simplistic, but they are derived from the analysis of similar situations
and may represent some powerful insights. The only question is whether
these insights are actually relevant to the particular case. But what I am
trying to point out is that I do not have to fully decompress the situation
(I do not need to use my full powers of analysis on it) to begin generating
superficial explanations.
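The idea can be sketched in a few lines of code. This is a hypothetical illustration, not a proposed implementation: the cues, the explanations, and the candidate_explanations function are all invented for the example.

```python
# Hypothetical sketch: rules of thumb that map shallow cues about a
# situation to candidate explanations, without building any full model
# of why the listener is actually annoyed.

RULES_OF_THUMB = [
    # (cue predicate over shallow features, candidate explanation)
    (lambda f: f.get("tone") == "annoyed" and not f.get("reason_given"),
     "he may not have understood what I was saying"),
    (lambda f: f.get("tone") == "annoyed" and f.get("reason_given"),
     "he may think the claim overstates what generalizations can do"),
    (lambda f: f.get("topic") == "obviousness",
     "the point may have seemed too obvious to be worth stating"),
]

def candidate_explanations(features):
    """Return every explanation whose cue matches the shallow features;
    no deep analysis of the situation is performed."""
    return [expl for cue, expl in RULES_OF_THUMB if cue(features)]

print(candidate_explanations({"tone": "annoyed", "reason_given": False}))
```

Each rule fires on superficial features alone; more sophisticated rules would simply add predicates over whatever further knowledge the situation supplies.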

Jim Bromer


On Sat, May 3, 2014 at 1:47 PM, Jim Bromer <[email protected]> wrote:

> Or, more precisely, that means that I am talking about a system of
> generalizations in which the output derived from combinations of
> generalized components does not need to first be fully decompressed
> relative to the individuals of the situation.
>
> Jim Bromer
>
>
> On Sat, May 3, 2014 at 1:42 PM, Jim Bromer <[email protected]> wrote:
>
>> I think PM is alluding to or otherwise working with something derived
>> from situation calculus. Looking at the Wikipedia entry and recalling some
>> other similar kinds of things, you can see just how difficult it would be
>> to get an AGI program to be able to understand different situations.
>> Because initial understanding is essentially at the same level of
>> difficulty as creating a useful or insightful response, situation
>> calculus was not even half a solution to AGI-level knowledge. As it
>> turns out, the glass wasn't even a quarter full.
>>
>> Most of the time when we generate insightful thoughts around a problem we
>> are drawing on knowledge about many different things and much of this
>> knowledge is beneficial even if it does not solve the main problems that we
>> wish we could solve. Drawing on experience or the memory of experience is
>> something that many people think requires highly sophisticated
>> sensorimotor interactions with the world. I disagree, and my disagreement
>> leads to many implications that I have to wonder about. I think that
>> sophisticated knowledge can be encoded into text. Then, according to this
>> point of view, in order to answer questions about a situation (or to
>> otherwise derive insight), the AGI program would have to derive that
>> information from knowledge that was itself derived from textual
>> interactions with the world.
>>
>> Much of this information would be composed of different smaller insights
>> that had been previously derived. Some of these previously acquired
>> insights might be refined and expressed as generalizations and so the
>> combination of simpler insights might be *generated*, in the
>> computational-theory sense of the term, not just mushed together
>> individually and refined.
>>
>> However, this leads to certain questions related to some of this group's
>> predilections. Since generalization is a kind of compression, am I only
>> talking about distributed compression?
>> Well, since generalizations could be combined -by form- and -by role-,
>> that means that I am talking about a special kind of compression in which
>> the output could be generated without first decompressing the individual
>> components. Or, more precisely, I am talking about a special kind of
>> generalization in which the output of the combinations of generalized
>> components does not need to be fully decompressed before it is used.
>>
>> The potential of this method, which draws on old AI theories and relates
>> directly to the potential of distributed compression methods, seems
>> obvious. But that does not mean it is easy to figure out how to get a
>> computer program to do something like this.
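A minimal sketch of that special kind of generalization, under loudly stated assumptions: Template, compose, and instantiate are invented names, and string templates stand in for compressed generalizations. Combination happens entirely on the compressed forms; decompression (binding to the individuals of a situation) happens only at the end, if at all.

```python
# Assumed, illustrative design: a generalization is a template with
# named role slots. Templates combine "by form and by role" symbolically,
# while still compressed; they are expanded against individuals only
# when an answer about a concrete situation is actually needed.

from dataclasses import dataclass

@dataclass(frozen=True)
class Template:
    form: str      # e.g. "X is annoyed at Y because Z"
    roles: tuple   # slot names, e.g. ("X", "Y", "Z")

def compose(outer: Template, slot: str, inner: Template) -> Template:
    """Substitute one generalization into a role of another.
    Both operands stay compressed; no individuals are involved."""
    form = outer.form.replace(slot, f"({inner.form})")
    remaining = [r for r in outer.roles if r != slot] + list(inner.roles)
    return Template(form, tuple(dict.fromkeys(remaining)))  # dedupe roles

def instantiate(t: Template, bindings: dict) -> str:
    """Only here is the combination decompressed for a situation."""
    out = t.form
    for role, individual in bindings.items():
        out = out.replace(role, individual)
    return out

annoy = Template("X is annoyed at Y because Z", ("X", "Y", "Z"))
cause = Template("X misunderstands what Y said", ("X", "Y"))

combined = compose(annoy, "Z", cause)   # combined without any decompression
print(combined.form)
print(instantiate(combined, {"X": "Bob", "Y": "Alice"}))
```

The point of the sketch is only that compose never touches individuals: arbitrarily many generalizations can be generated and combined at the compressed level, and instantiate is deferred until a particular situation demands it.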
>>
>> Jim Bromer
>>
>>
>> On Thu, May 1, 2014 at 4:15 PM, Mike Archbold via AGI
>> <[email protected]> wrote:
>>
>>> On 5/1/14, Piaget Modeler via AGI <[email protected]> wrote:
>>> > Okay,
>>> > Now that we have a fuzzy definition of situations, what do the words
>>> > "situation induction" mean to you?
>>> > Please advise.
>>> > ~PM
>>> >
>>> >
>>>
>>> Did you acquaint yourself with "situation calculus"?  I think Ben
>>> alluded to this.
>>> Mike
>>>
>>> > -------------------------------------------
>>> > AGI
>>> > Archives: https://www.listbox.com/member/archive/303/=now
>>> > RSS Feed:
>>> https://www.listbox.com/member/archive/rss/303/11943661-d9279dae
>>> > Modify Your Subscription:
>>> > https://www.listbox.com/member/?&;
>>> > Powered by Listbox: http://www.listbox.com
>>> >
>>>
>>>
>>>
>>
>>
>


