On Sun, Sep 28, 2008 at 5:21 AM, David Hart <[EMAIL PROTECTED]> wrote:
> Hi YKY,
>
> Can you explain what is meant by "collect commonsense knowledge"?

That means collecting facts and rules.

Example of a commonsense fact:  "apples are red"

Example of a commonsense rule:  "if X is female, then X has an
above-average chance of having long hair"
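To make the distinction concrete, here is a minimal sketch (my own illustrative encoding, not any particular system's format) of how such a fact and such a probabilistic rule might be represented; all names and the 0.6 probability are made up for the example:

```python
# A commonsense fact: a ground assertion with a degree of confidence.
# ("apples are red" -- true often, but not always)
fact = ("red", "apple", 0.7)

# A commonsense rule: a condition on X, a conclusion about X, and a
# probability expressing "above-average chance" rather than certainty.
rule = {
    "if": lambda x: x.get("sex") == "female",
    "then": ("has_long_hair", True),
    "p": 0.6,  # illustrative value
}

def apply_rule(rule, entity):
    """Return (conclusion, probability) if the rule fires, else None."""
    if rule["if"](entity):
        return rule["then"], rule["p"]
    return None

alice = {"name": "Alice", "sex": "female"}
print(apply_rule(rule, alice))  # (('has_long_hair', True), 0.6)
```

The point is only that rules carry graded, defeasible conclusions, where plain facts are ground assertions.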

> Playing the friendly devil's advocate, I'd like to point out that Cyc seems
> to have been spinning its wheels for 20 years, building a nice big database
> of 'commonsense knowledge' but accomplishing no great leaps in AI. Cyc's
> conundrum is discussed perennially on various lists with many possible
> explanations posited for Cyc's lackluster performance: Perhaps its krep is
> too brittle and too reduced? Perhaps its ungroundedness is its undoing?
> Perhaps there's no coherent cognitive architecture on which to build an
> effective learning & reasoning system?

IMO Cyc's problem is due to:
1.  the lack of a well-developed probabilistic/fuzzy logic (thus brittleness)
2.  the emphasis on ontology (plain facts) rather than "production rules"

(I say "production rules" to distinguish them from "inference rules",
which are meta-logical, though the former term sounds a bit
outdated.)

> Before people volunteer to work on building yet another commonsense
> knowledge system, perhaps they'll want to know how you plan to avoid the Cyc
> problem?

Well, commonsense reasoning has been my area of interest ever since I
started considering AGI.  I don't have a single killer idea that can
solve the problem, but I plan to use:

1.  an approximate probabilistic fuzzy logic
2.  an architecture especially designed for shallow commonsense
reasoning (as opposed to theorem proving or expert-system-style
inference)
3.  reasoning algorithms such as abduction, induction, and belief revision
4.  an online community to teach the AGI
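As a rough illustration of points 1 and 2 (this is a sketch of the general idea, not my actual design): shallow reasoning can be approximated by forward chaining over probabilistic rules with a hard depth cap, combining probabilities multiplicatively, instead of exhaustive proof search. The predicate names and numbers below are invented for the example:

```python
def forward_chain(facts, rules, max_depth=2):
    """Shallow probabilistic forward chaining.

    facts: dict mapping a proposition string to its probability.
    rules: list of (premise, conclusion, p) triples.
    Derived beliefs get probability = P(premise) * p; the depth cap
    keeps inference shallow, unlike open-ended theorem proving.
    """
    beliefs = dict(facts)
    for _ in range(max_depth):
        updated = False
        for premise, conclusion, p in rules:
            if premise in beliefs:
                new_p = beliefs[premise] * p
                if new_p > beliefs.get(conclusion, 0.0):
                    beliefs[conclusion] = new_p
                    updated = True
        if not updated:
            break
    return beliefs

facts = {"is_female(mary)": 1.0}
rules = [("is_female(mary)", "has_long_hair(mary)", 0.6)]
print(forward_chain(facts, rules))
# {'is_female(mary)': 1.0, 'has_long_hair(mary)': 0.6}
```

A real system would need a more careful combination rule than plain multiplication, but the sketch shows why graded conclusions avoid the brittleness of crisp logic: the conclusion is held weakly rather than asserted or refused outright.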

Not that these ideas are unique to my AGI...  but there are subtle
and interesting differences among our projects.  I think it is
actually a good thing to have multiple AGI projects, each with its own
personality and way of thinking.

> Even a brief explanation would be helpful, e.g. the OpenCog Prime design
> plans to address the Cyc problem by learning and reasoning over commonsense
> knowledge that is gained almost entirely by experience (interacting with
> rich environments and human teachers in virtual worlds) rather than by
> attempting to reason over absurdly reduced and brittle bits of hand-encoded
> knowledge. OPC does not represent commonsense knowledge internally
> (natively) with a distinct crisp logical form (the actual form is a topic of
> the OCP tutorial sessions), although it can be directed to transform its
> internal commonsense knowledge representations into such a form over time
> and with much effort. It's my hunch however that such transformations are of
> little practical value; inspecting a compact and formal krep output might
> help researchers evaluate what an OCP system has learned, but 'AGI
> intelligence tests' also work to this end and arguably have significant
> advantages over the non-interactive and detached examination of krep dumps.

*My view* is that embodiment is not a critical factor -- and yes, I
already know Ben and Pei's view =)

I think I may be able to short-circuit the learning loop by using
"minimal" grounding.  The Helen Keller argument =)

YKY


-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now