On Mon, Feb 3, 2014 at 7:50 PM, Samantha Atkins <[email protected]> wrote:

> Experience, from inductive leap, act on its basis giving more experience.
> If the results are more or less as the leap expected then keep it, if not
> then modify or discard it.
>
> Rinse, lather, repeat.   I don't see why that needs to look like more
> random trial and error.  Am I missing something?
>

My impression is that you are missing something, but it may not be
important from your perspective. The effort to interpret something in
terms of previously developed abstractions (and simplifications) seems like
a normal part of intelligence. However, wise people are aware that they
might be missing something if they lean too heavily on this process.

Trial and error does not refer to total randomness. In fact, the concept of
'total randomness' only makes sense as an imaginary (philosophical)
construct. Just as randomness is defined within the confines of regularity,
trial and error is defined within the confines of an expected stability of
some kind of background knowledge. So, for example, if I find that it is
not easy to talk to someone, I can try simplifying my comments to see if
that makes it easier to talk to him. Since I don't know whether that will
work, the refinement is part of a trial and error process.
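To make that concrete, here is a toy sketch of trial and error carried out
against a fixed body of background knowledge. This is my own hypothetical
illustration; the candidate list, the evaluate function, and the threshold
are all invented for the example:

```python
def trial_and_error(candidates, evaluate, background, threshold=0.5):
    """Try candidate refinements against stable background knowledge.

    The background is held fixed during the trials; a candidate is kept
    only if it scores above the threshold, otherwise it is discarded.
    """
    kept = []
    for candidate in candidates:
        score = evaluate(candidate, background)  # outcome unknown in advance
        if score > threshold:
            kept.append(candidate)               # keep what worked
    return kept

# Toy example: trying simpler phrasings against a fixed listener vocabulary.
background = {"vocabulary": {"the", "dog", "ran", "fast"}}

def evaluate(sentence, bg):
    words = sentence.split()
    return sum(w in bg["vocabulary"] for w in words) / len(words)

print(trial_and_error(["the dog ran fast", "the canine perambulated"],
                      evaluate, background))
# -> ['the dog ran fast']
```

The point of the sketch is that the trials are not random at all: they are
only meaningful relative to the background that stays stable while the
candidates vary.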

The importance of structural relations does not refer only to the
underlying structure of a pre-programmed system (like logic); it can also
be found in learned and acquired relations. Structural learning is
significant because it can act as an amplifier of incremental learning.
The idea I am getting at is not new in education. However, I did not know
what to call it, so I used the term structural knowledge. But this theory
has hit this community with a silent thud, so I suspect that few other
people see the potential significance to AI/AGI that I see. My conclusion,
based on your two-paragraph message, is that this must be something you
either missed or totally disagree with.

At certain points the structure of knowledge (about some issue) can become
strong enough that it can suddenly support and explain answers to a number
of questions that the student would not previously have been able to
answer. This is simply the theory that we can build knowledge on previously
learned knowledge. What I am saying, however, is that while the trial and
error accumulation of knowledge is incremental, when the student learns
something that holds a greater structure of knowledge together he can
suddenly use that structure to generate a great many further insights
related to the knowledge. At this point the structure amplifies, or
leverages, the incrementally accumulated knowledge.
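One way to picture the amplification (again a toy sketch of mine, not a
claim about how the mind actually implements it): treat incrementally
learned facts as edges in a graph and count how many pairs of ideas can be
connected through it. A single bridging fact can multiply the number of
connections at a stroke:

```python
from itertools import combinations

def connected_pairs(edges, nodes):
    """Count node pairs joined through the knowledge graph (union-find)."""
    parent = {n: n for n in nodes}
    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]  # path halving
            n = parent[n]
        return n
    for a, b in edges:
        parent[find(a)] = find(b)          # merge the two components
    return sum(find(a) == find(b) for a, b in combinations(nodes, 2))

nodes = list("ABCDEF")
incremental = [("A", "B"), ("B", "C"), ("D", "E"), ("E", "F")]  # two islands
print(connected_pairs(incremental, nodes))                 # -> 6
print(connected_pairs(incremental + [("C", "D")], nodes))  # -> 15
```

Four incremental facts yield six connected pairs; one further fact that
bridges the two islands more than doubles that, which is the kind of sudden
leverage I mean.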

Generally, if the knowledge is useful it will be used and modified.
Sometimes the structure will contain such serious flaws that it will have
to be discarded. Expectation is part of this process, but it is not the
only instrument of confirmation or disconfirmation. For example, if a new
piece of information can be made sense of through some recently acquired
structural knowledge, it might be used to tie other kinds of knowledge into
the structure. I see this process as the key to understanding how people
make intuitive leaps, but it can also explain how people go overboard and
start converting everything into their theories, even though those theories
may seem incomplete to someone else.
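A minimal sketch of that keep/tie-in/discard judgment (once more my own toy
illustration; the fit test and the threshold are invented for the example):

```python
def update_structure(structure, observations, fit, fit_threshold=0.5):
    """Keep, extend, or discard a knowledge structure as new information arrives.

    Each observation either fits the structure (and is tied into it) or
    counts against it; too high a misfit rate and the whole structure is
    discarded rather than patched.
    """
    misfits = 0
    for obs in observations:
        if fit(structure, obs):
            structure = structure | {obs}    # tie the new knowledge in
        else:
            misfits += 1
    if observations and misfits / len(observations) > fit_threshold:
        return None                          # structure too flawed: discard
    return structure

# Toy structure: a set of even numbers; "fit" means the observation is even.
def is_even(structure, obs):
    return obs % 2 == 0

print(sorted(update_structure({2, 4}, [6, 8, 3], is_even)))  # -> [2, 4, 6, 8]
print(update_structure({2, 4}, [3, 5, 7], is_even))          # -> None
```

Note that expectation enters only through the fit test; the same loop also
shows the failure mode, since a lax enough fit function will happily
convert everything into the theory.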

The principle here is that a lot of knowledge (including know-how) has to
be structured in order to build good judgment. So, while you might use some
sort of numerical evaluation method in the process, the key to
understanding how it works should not be reduced to, for example, a
probabilistic evaluative-expectation-comparison model. There is more to it
than the evaluative-expectation-comparison model.
Jim Bromer


On Mon, Feb 3, 2014 at 7:50 PM, Samantha Atkins <[email protected]> wrote:

> On Sat, Feb 1, 2014 at 4:14 AM, Jim Bromer <[email protected]> wrote:
>
>> Well, thanks for pointing that out John...
>> But other than being a little tedious, my article wasn't too bad.
>> Jim Bromer
>>
>>
>> On Fri, Jan 31, 2014 at 1:57 PM, John Rose <[email protected]>
>> wrote:
>> > That's because the text "or the turn-the-crank product of successful
>> formal
>> > methods" is in your ideas summary not because anyone would think you're
>> a
>> > crank or anything. Though coincidentally if you search on "Jim Bromer"
>> > "inaccurate theories" it narrows it down to only one result :)
>> >
>> > John
>> >
>> > -----Original Message-----
>> > From: Jim Bromer [mailto:[email protected]]
>> > Sent: Thursday, January 30, 2014 5:38 PM
>> > To: AGI
>> > Subject: [agi] "Jim Bromer" crank
>> >
>> >  I put
>> > "Jim Bromer" crank
>> > in Bing and it came up with these two links:
>> >
>> >
>> > Re: [agi] Summary of My Current Theory For an AGI Program ...
>> >
>> > weyounet.info/2013/04/re-agi-summary-of-my-current-theory-for-an...
>> >
>> >
>> > Summary of My Current Theories for an AGI Program.
>> >
>> > www.jimbromer.com/SummaryOfMyCurrentIdeasAboutAGI.html
>> >
>> > So it looks like my critics must have been right...
>> > :(
>> >
>> > Jim Bromer
>> >
>



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
