YKY,

I'm with Pei on this one...

Decades of trying to do procedure learning using logic have led only to some
very brittle planners that are useful under very special and restrictive
assumptions...

Some of that work is useful but it doesn't seem to me to be pointing in an AGI
direction.

OTOH, evolutionary learning and NNs, for instance, have been more successful
at learning simple procedures for embodied action.

Within NM we have done (and published) experiments using probabilistic logic
for procedure learning, so I'm well aware it can be done.  But I don't think
it's a scalable approach.

There appears to be a solid information-theoretic reason that the human brain
represents and manipulates declarative, procedural and episodic knowledge
separately.

It's more complex, but I believe it's a better idea to have separate methods
for representing and learning/adapting procedural vs. declarative knowledge
--- and then have routines for converting between the two forms of knowledge.

One advantage AGIs will have over humans is better methods for translating
procedural to declarative knowledge, and vice versa.
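To make the idea concrete, here is a toy sketch of what "separate stores plus
conversion routines" could look like.  This is purely illustrative -- the
function names, the triple-store format, and the conversion heuristics are my
invention here, not Novamente's actual design:

```python
# Toy sketch: procedural knowledge as an executable program, declarative
# knowledge as logic-style (subject, relation, object) facts, plus routines
# that move between the two forms.

def reverse_proc(seq):
    """Procedural knowledge: an executable program for reversing a sequence."""
    out = []
    for item in seq:
        out.insert(0, item)
    return out

# Declarative knowledge: facts *about* the same skill.
declarative_kb = {
    ("reverse", "preserves", "length"),
    ("reverse", "is_inverse_of", "reverse"),
}

def proceduralize(fact_store, skill):
    """Toy declarative -> procedural direction: check whether the KB
    contains any facts licensing the use of a named skill."""
    return any(subj == skill for (subj, _, _) in fact_store)

def declarativize(proc, samples):
    """Toy procedural -> declarative direction: run the program on sample
    inputs and emit the facts its observed behavior appears to satisfy."""
    facts = set()
    if all(len(proc(s)) == len(s) for s in samples):
        facts.add((proc.__name__, "preserves", "length"))
    if all(proc(proc(s)) == s for s in samples):
        facts.add((proc.__name__, "is_inverse_of", proc.__name__))
    return facts
```

The redundancy Pei mentions below shows up directly: the same regularity
("reversing preserves length") lives once as behavior and once as a fact, and
the conversion routine is what keeps the two stores in sync.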

For us to translate "knowing how to do X" into "knowing how we do X" can be
really difficult.  (I play piano improvisationally and by ear, and I have a
hard time figuring out what the hell my fingers are doing, even though they
do the same complex things repeatedly each time I play the same song...)
This is not a trivial problem for AGIs either, but it won't be as hard as it
is for humans...
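The asymmetry is easy to illustrate: unlike a pianist's fingers, a software
agent's procedures can simply be instrumented, so every execution yields a
declarative record of "how it did X".  A toy sketch (all names here are
hypothetical, just to show the shape of the idea):

```python
# Toy sketch: wrap a procedure so each invocation is logged as a
# declarative record -- machine introspection that humans lack.

def trace_calls(proc):
    """Return an instrumented version of proc plus a shared log list;
    each call appends a declarative record of what the procedure did."""
    log = []

    def wrapped(*args):
        result = proc(*args)
        log.append({"op": proc.__name__, "args": args, "result": result})
        return result

    return wrapped, log

def add(a, b):
    return a + b

traced_add, log = trace_calls(add)
traced_add(2, 3)
# log now holds [{"op": "add", "args": (2, 3), "result": 5}]
```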

-- Ben G

On Tue, Feb 26, 2008 at 8:00 AM, Pei Wang <[EMAIL PROTECTED]> wrote:
> On Tue, Feb 26, 2008 at 7:03 AM, YKY (Yan King Yin)
>  <[EMAIL PROTECTED]> wrote:
>  >
>  > On 2/15/08, Pei Wang <[EMAIL PROTECTED]> wrote:
>  > >
>  > > To me, the following two questions are independent of each other:
>  > >
>  >  > *. What type of reasoning is needed for AI? The major answers are:
>  > > (A): deduction only, (B) multiple types, including deduction,
>  > > induction, abduction, analogy, etc.
>  > >
>  > > *. What type of knowledge should be reasoned upon? The major answers
>  >  > are: (1) declarative only, (2) declarative and procedural.
>  > >
>  > > All four combinations of the two answers are possible. Cyc is mainly
>  > > A1; you seem to suggest A2; in NARS it is B2.
>  >
>  >
>  > My current approach is "B1".  I'm wondering what is your argument for
>  > including procedural knowledge, in addition to declarative?
>
>  You have mentioned the reason in the following: some important
>  knowledge is procedural by nature.
>
>
>  > There is the idea of "deductive planning" which allows us to plan actions
>  > using a solely declarative KB.  So procedural knowledge is not needed for
>  > acting.
>
>  I haven't seen any non-trivial result supporting this claim.
>
>
>  > Also, if you include procedural knowledge, things may be learned doubly in
>  > your KB.  For example, you may learn some declarative knowledge about the
>  > concept of "reverse" and also procedural knowledge of how to reverse
>  > sequences.
>
>  The knowledge about "how to do ..." can either be in procedural form,
>  as "programs", or in declarative, as descriptions of the programs.
>  There is overlapping/redundant information in the two, but very often
>  both are needed, and the redundancy is tolerated.
>
>
>  > Even worse, in some cases you may only have procedural knowledge, without
>  > anything declarative.  That'd be like the intelligence of a calculator,
>  > without true understanding of maths.
>
>  Yes, but that is exactly the reason to reason directly on
>  procedural knowledge, right?
>
>  Pei
>
>
>  > YKY
>



-- 
Ben Goertzel, PhD
CEO, Novamente LLC and Biomind LLC
Director of Research, SIAI
[EMAIL PROTECTED]

"If men cease to believe that they will one day become gods then they
will surely become worms."
-- Henry Miller

-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: http://www.listbox.com/member/?member_id=8660244&id_secret=95818715-a78a9b
Powered by Listbox: http://www.listbox.com
