On Sat, Nov 03, 2007 at 12:06:48AM +0300, Vladimir Nesov wrote:
> On 11/2/07, Linas Vepstas <[EMAIL PROTECTED]> wrote:
> > On Fri, Nov 02, 2007 at 10:34:26PM +0300, Vladimir Nesov wrote:
> > > On 11/2/07, Linas Vepstas <[EMAIL PROTECTED]> wrote:
> > > > On Fri, Nov 02, 2007 at 08:51:43PM +0300, Vladimir Nesov wrote:
> > > > > But the learning problem isn't changed by it. And if you solve
> > > > > the learning problem, you don't need any scaffolding.
> > > >
> > > > But you won't know how to solve the learning problem until you try.
> > >
> > > Until you try to solve the learning problem. How can
> > > scaffolding-building help in solving it?
> >
> > My scaffolding learns. It remembers assertions you make, and it will
> > parrot them back. It checks whether the assertions you make fit into
> > its belief network before it actually commits them to memory.
> >
> > It can be told things like "aluminum is a mass noun", and then will start
> > using "aluminum" instead of "the aluminum" or "an aluminum" in future
> > sentences.
> >
> > Sure, I hard-coded the part where "mass nouns don't require an article",
> > that's part of the scaffolding. But that's temporary. That's because
> > the thing isn't yet smart enough to understand what the sentence
> > "mass nouns don't require an article" means.
>
> What I meant is to extract the learning part and call the rest
> 'scaffolding'. In this case, what the system actually learns is a
> tagging of terms ('aluminum') with other terms ('is-a-mass-noun'), and
> this tagging is provided directly. So it only learns one term->term
> mapping, which is coded in explicitly through a textual interface
> (the scaffolding) when you enter phrases like "aluminum is a mass
> noun". That's hardly a perceptible step in prototyping the learning
> dynamics.
1) I did not claim to be doing fundamental or groundbreaking AI
research. In fact, I claimed the opposite: that this has been done
before, and I know that many folks have abandoned this approach.
I am interested in finding out what the roadblocks were.
2) I recently posed the system the question "what is lincoln?", and it
turns out that OpenCyc knows about 15 or 20 "lincoln counties"
scattered around the United States. So, instead of having the
thing rattle off all 20 counties, I want it to deduce what they all
have in common, and then respond "lincoln might be one of many
different counties". I think this kind of deduction will take only a
few hours to implement: pattern-match to find common ancestors.
So, after asserting "aluminum is a mass noun", it might plausibly deduce
"most minerals are mass nouns" -- one could call this "data mining".
This would use the same algorithm as deducing that many of the things
called "lincoln" are "counties".
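To make the idea concrete, here is a minimal sketch of the
common-ancestor deduction, assuming the knowledge base can be viewed as
an is-a graph. The graph below is invented for illustration; the real
OpenCyc term names would differ.

```python
# Hypothetical sketch: intersect the ancestor sets of several senses of
# "lincoln" to find what they all have in common. The ISA table here is
# made up; it stands in for the is-a links a system like OpenCyc holds.

ISA = {
    "LincolnCounty-Nevada":  ["County"],
    "LincolnCounty-Oregon":  ["County"],
    "LincolnCounty-Montana": ["County"],
    "County":                ["GeopoliticalEntity"],
    "GeopoliticalEntity":    ["Thing"],
}

def ancestors(term):
    """All terms reachable by following is-a links upward."""
    seen = set()
    stack = [term]
    while stack:
        t = stack.pop()
        for parent in ISA.get(t, []):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

def common_ancestors(terms):
    """Intersection of the ancestor sets of all the given terms."""
    sets = [ancestors(t) for t in terms]
    return set.intersection(*sets) if sets else set()

senses = ["LincolnCounty-Nevada", "LincolnCounty-Oregon",
          "LincolnCounty-Montana"]
shared = common_ancestors(senses)
# All three senses share "County" (and its ancestors), so the system
# could answer "lincoln might be one of many different counties".
print(shared)
```

The same intersection would support the "data mining" direction: take
all known mass nouns, intersect their ancestors, and propose a rule like
"most minerals are mass nouns" when most of them share an ancestor.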
I want to know how far down this path one can go, and how far anyone has
gone. I can see that it might not be a good path, but I don't see any
alternatives at the moment.
--linas
-----
This list is sponsored by AGIRI: http://www.agiri.org/email