On Fri, 8 Nov 2002, Ben Goertzel wrote:
> Stephen Reed wrote:
> > Regarding Cognitive Cyc, I have completed a UML State Machine interpreter
> > in Java that complies with UML v1.4 but stops short of Action Semantics;
> > instead, I use the DynamicJava interpreter to evaluate Java source
> > statements for transition guard conditions, transition effects, and state
> > entry/exit procedures.  Currently I am authoring Cyc vocabulary to
> > represent the UML objects and relationships, with the goal of having Cyc
> > construct state machine models in the knowledge base that can be
> > subsequently interpreted to make Cyc do things.
>
> That is very interesting.
>
> How do you plan to build up a rich collection of connections between
> Cyc's repository of state machine models and Cyc's primary database of
> human common-sense knowledge?

Cyc has a vocabulary for scripts, which are event types complex enough to
have a temporally ordered set of sub-events.  State machine models will be
derived from scripts.
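
To make the derivation concrete, here is a rough Java sketch of what I
have in mind.  The class and method names are invented for illustration
(they are not the actual interpreter code), and the GuardEvaluator
interface merely stands in for the DynamicJava call that evaluates a
guard expressed as Java source text:

import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only: a script's temporally ordered sub-events
// become a linear chain of states, each guarded by a Java source string
// that an embedded interpreter (DynamicJava in the real system) would
// evaluate.
public class ScriptToStateMachine {

    // Stand-in for runtime evaluation of a Java source string; the actual
    // DynamicJava API is not shown here.
    interface GuardEvaluator {
        boolean evaluate(String javaSource);
    }

    static class State {
        final String name;        // derived from the sub-event term
        final String guardSource; // Java source text guarding entry
        State next;
        State(String name, String guardSource) {
            this.name = name;
            this.guardSource = guardSource;
        }
    }

    // Build a chain of states from the script's ordered sub-event names.
    static State fromScript(List<String> orderedSubEvents) {
        State head = null, tail = null;
        for (String subEvent : orderedSubEvents) {
            State s = new State(subEvent, "true"); // default guard always fires
            if (head == null) {
                head = s;
            } else {
                tail.next = s;
            }
            tail = s;
        }
        return head;
    }

    // Walk the chain, entering each state whose guard evaluates to true.
    static void run(State start, GuardEvaluator guards) {
        for (State s = start; s != null; s = s.next) {
            if (!guards.evaluate(s.guardSource)) {
                break; // guard failed; the machine waits here
            }
            System.out.println("entering state: " + s.name);
            // entry/exit procedures would likewise be interpreted Java source
        }
    }

    public static void main(String[] args) {
        List<String> subEvents = new ArrayList<String>();
        subEvents.add("OrderMeal");
        subEvents.add("EatMeal");
        subEvents.add("PayBill");
        State machine = fromScript(subEvents);
        run(machine, new GuardEvaluator() {
            public boolean evaluate(String javaSource) {
                return true; // trivial evaluator for the sketch
            }
        });
    }
}

In the real interpreter the guards, effects, and entry/exit procedures
would all be Java source strings stored in the knowledge base rather
than the hard-coded defaults shown here.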

> I can see how these connections could be built up through experience if Cyc
> were actually controlling a robotic system, for example.

I anticipate that these scripts will be constructed and modified during
the performance of problem-solving tasks, for example the automatic
mapping of foreign terms to Cyc's reference ontology.

> > I have also investigated the JavaBayes Bayesian inference engine and will
> > connect it to Cyc's Bayesian vocabulary so that Cyc can perform Bayesian
> > inference.  And I have a plan to modestly implement fuzzy inference to
> > support Cyc's existing fuzzy vocabulary.
>
> What kind of approach does JavaBayes use for handling very large Bayesian
> nets?  Gibbs sampling and the other standard MCMC methods seem not to scale
> very well.

I was not aware of the scaling issue you describe until now, but I
envision that the Bayes nets used for causal deliberation will be modest
in size, and JavaBayes runs very fast on the sample problems provided
with it.
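
For nets of that modest size, even brute-force enumeration is cheap.
Here is a toy example of the kind of causal query I expect, hand-rolled
for illustration (this is not the JavaBayes API, and the numbers are
made up):

// Illustrative only: a hand-rolled two-node net (Cause -> Effect).
public class TinyBayesNet {

    // Made-up conditional probabilities for the sketch.
    static final double P_CAUSE = 0.1;              // P(cause)
    static final double P_EFFECT_GIVEN_CAUSE = 0.8; // P(effect | cause)
    static final double P_EFFECT_GIVEN_NOT = 0.05;  // P(effect | ~cause)

    // P(cause | effect) by enumerating the two cases of the hidden variable.
    static double posteriorCauseGivenEffect() {
        double jointCauseEffect = P_CAUSE * P_EFFECT_GIVEN_CAUSE;
        double jointNotCauseEffect = (1.0 - P_CAUSE) * P_EFFECT_GIVEN_NOT;
        return jointCauseEffect / (jointCauseEffect + jointNotCauseEffect);
    }

    public static void main(String[] args) {
        System.out.println("P(cause | effect) = " + posteriorCauseGivenEffect());
        // prints roughly 0.64 for the numbers above
    }
}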

> The following paper seems to contain a tricky approximation algorithm that
> works better than the standard Gibbs sampling method, for very large nets...
>
> http://citeseer.nj.nec.com/jensen96blocking.html

Thanks for the tip!

> I have not tried this algorithm myself, just read the paper, but the
> concepts seem solid to me.
>
> We don't use Bayes nets for inference in Novamente; we have our own
> probabilistic reasoning system, Probabilistic Term Logic.
>
> However, we are exploring the use of Bayes nets inside a substantially
> modified version of Pelikan and Goldberg's Bayesian Optimization Algorithm
>
> http://citeseer.nj.nec.com/pelikan99boa.html
>
> which we may use as a component of our parameter optimization and procedure
> learning components.
>
> > And I am beginning to flesh out Java classes to implement the NIST/Albus
> > Reference Architecture, which will give Cognitive Cyc its cognitive
> > behavior - bit by bit.  The UML state machine interpreter will be the
> > behavioral framework for implementing the Perception, Value Judgment,
> > and Behavior Generation components of the NIST/Albus Reference
> > Architecture.
>
> I guess that's an OK behavioral framework... the big problem though is how
> you're going to *learn* perceptual and behavioral schemata ("subprograms")
> within Cyc.

The answer seems obvious to me, though I am *so* naive in this area: Cyc
will learn by asking for help when it reaches an impasse, and it will
learn from experience when making choices about which goals to pursue
and which actions to take to accomplish those goals.

> I strongly suspect that Cyc's collection of logical inference methods are
> all badly inadequate for this purpose....  Do you disagree?

Yes, I agree that passive behavior and an expressive deductive inference
engine alone are inadequate, but I hope that Cyc's inference engine will
provide an excellent foundation upon which to construct the Cognitive Cyc
application - which brings in deliberative goal seeking, fuzzy logic,
Bayesian modeling, a multi-resolutional hierarchical control structure,
and reflective (self-aware) behavior.  I expect that Cyc will learn a
very large set of probabilistic relevance meta-assertions in order to
provide the focus behavior called for by the NIST/Albus Reference Model
Architecture.  These meta-assertions would provide focus during
deductive inference by filtering the permitted set of ground facts and
backchaining rules according to the current level of resolution (e.g.
planet, continent, region, city, or neighborhood) and the current
situation.
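
As a rough sketch of what I mean by filtering on resolution and
situation (the classes, fields, and threshold here are invented for
illustration and are not existing Cyc vocabulary):

import java.util.ArrayList;
import java.util.List;

// Illustrative only: filter candidate assertions by relevance
// meta-assertions keyed on resolution level and situation.
public class RelevanceFilter {

    enum Resolution { PLANET, CONTINENT, REGION, CITY, NEIGHBORHOOD }

    static class CandidateAssertion {
        final String formula;
        final Resolution resolution; // level at which this assertion applies
        final String situation;      // situation tag, e.g. "route-planning"
        final double relevance;      // learned probabilistic relevance weight
        CandidateAssertion(String formula, Resolution resolution,
                           String situation, double relevance) {
            this.formula = formula;
            this.resolution = resolution;
            this.situation = situation;
            this.relevance = relevance;
        }
    }

    // Keep only assertions relevant at the current resolution and situation,
    // above a learned relevance threshold.
    static List<CandidateAssertion> focus(List<CandidateAssertion> candidates,
                                          Resolution current, String situation,
                                          double threshold) {
        List<CandidateAssertion> focused = new ArrayList<CandidateAssertion>();
        for (CandidateAssertion c : candidates) {
            if (c.resolution == current
                    && c.situation.equals(situation)
                    && c.relevance >= threshold) {
                focused.add(c);
            }
        }
        return focused;
    }
}

The relevance weights stand for the learned probabilistic
meta-assertions; the filter simply keeps the inference engine from
considering ground facts and rules outside the current level of
resolution and situation.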

-Steve

-- 
===========================================================
Stephen L. Reed                  phone:  512.342.4036
Cycorp, Suite 100                  fax:  512.342.4040
3721 Executive Center Drive      email:  [EMAIL PROTECTED]
Austin, TX 78731                   web:  http://www.cyc.com
         download OpenCyc at http://www.opencyc.org
===========================================================
