Nil,

I am wondering about using the URE (Unified Rule Engine) in two new ways...

1) Pattern Mining

The idea here would be that the core loop of the pattern miner could
be modified to use the URE.

Basically, the step where a pattern is expanded into a set of (zero or
more) larger patterns would be done by enacting an "expansion rule"
on the pattern.   The premises of the expansion rule would be: the
pattern P being expanded, and another link that connects to P.  The
output of the expansion rule would be an expanded pattern, with a
surprisingness value (which, if it's too low, might cause the expanded
pattern to be rejected).

This is semi-straightforward forward chaining, I think...
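To make the shape of the loop concrete, here is a toy sketch of the idea in Python. None of these names (expansion_rule, mine, connected_links, etc.) are actual URE or pattern-miner API; patterns are modeled as frozensets of links, and surprisingness is just a supplied function:

```python
# Hypothetical sketch: the pattern-miner core loop recast as forward
# chaining with an "expansion rule". A pattern is modeled as a frozenset
# of links; surprisingness is an arbitrary caller-supplied function.

def expansion_rule(pattern, link, surprisingness, threshold):
    """Premises: a pattern, and a link that connects to it.
    Conclusion: the expanded pattern with its surprisingness value,
    or None if the surprisingness is too low (pattern rejected)."""
    expanded = frozenset(pattern | {link})
    s = surprisingness(expanded)
    return (expanded, s) if s >= threshold else None

def mine(seed, connected_links, surprisingness, threshold, max_size=3):
    """Forward-chain: repeatedly apply the expansion rule to a frontier
    of patterns, collecting the surviving expansions."""
    frontier = {frozenset(seed)}
    accepted = {}                     # expanded pattern -> surprisingness
    while frontier:
        pattern = frontier.pop()
        if len(pattern) >= max_size:  # stop growing past the size bound
            continue
        for link in connected_links(pattern):
            result = expansion_rule(pattern, link, surprisingness, threshold)
            if result is not None:
                expanded, s = result
                if expanded not in accepted:
                    accepted[expanded] = s
                    frontier.add(expanded)   # expansions feed back in
    return accepted
```

For instance, with a toy surprisingness equal to pattern size and links {"B", "C"} attachable to seed {"A"}, the loop yields {A,B}, {A,C}, and {A,B,C}.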

2) Grammar parsing

Here the idea would be similar to the "Viterbi link parser" that Linas
got partway through writing a while back.

The parser would iterate through a sentence from beginning to end.
As it proceeds, it would maintain a tree of possible parses,
similar to a "backward inference tree".   At each stage, it expands
the tree of possible parses using an "expansion rule."   The
expansion rule takes the next word in the sentence, finds its
disjuncts, and chooses the highest-weight way to match those
disjuncts with the disjuncts of words occurring elsewhere in the
sentence.  Its output is either to add the links representing the
match to the parse tree it's expanding, creating a new possible
parse; or to fail.  If it fails, then backtracking happens, and some
earlier word in the sentence is subjected to the expansion rule again
(but this time the expansion rule is tasked with finding the
highest-weight way to match the disjuncts of the word being expanded
that has not already been tried and found to fail in the context of
the rest of the parse tree being expanded) ...

This would require a modified chainer, i.e. a "backward-forward
chainer" that does forward chaining in a way that enables explicit
backtracking when a dead-end is hit...
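The control flow of such a backward-forward chainer can be sketched as depth-first search: forward expansion word by word, with explicit backtracking to an earlier word when the current word's matches are exhausted. The disjunct model here is deliberately toy (candidates and compatible are caller-supplied stand-ins, not any real parser API):

```python
# Sketch of the "backward-forward chainer" control flow: forward
# chaining over the words of a sentence, with explicit backtracking
# when the expansion rule hits a dead end.

def parse(words, candidates, compatible):
    """candidates(word) -> possible matches, sorted best-weight first.
    compatible(partial_parse, match) -> whether the match can be added
    to the parse built so far. Returns a complete parse, or None."""
    def expand(i, partial):
        if i == len(words):
            return partial               # every word matched: a full parse
        for match in candidates(words[i]):   # try highest-weight first
            if compatible(partial, match):
                result = expand(i + 1, partial + [match])
                if result is not None:
                    return result
        return None   # dead end: caller backtracks to an earlier word
    return expand(0, [])
```

The recursion stack is what makes the backtracking "explicit": when expand returns None for word i, control falls back to word i-1's loop, which moves on to its next untried match, exactly the behavior described above.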

...

There is a similarity between these two cases, in that they both
involve using rules that expand structures...

obviously 1) is similar to

3) expanding the program tree in a MOSES deme using the URE ...

I.e., 3) is basically the same as 1) except that one has an arbitrary
fitness function in place of surprisingness.  The MOSES program-tree
node vocabulary in use in a given evolution round would be specified
by a "pattern template" similar to what one uses in the pattern miner
now to specify what kinds of patterns to search for...
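A minimal sketch of that substitution, assuming program trees as nested tuples and a vocabulary standing in for the "pattern template" (again, all names are illustrative, not MOSES or URE API):

```python
# Toy sketch of one round of deme expansion: the same expansion-rule
# loop as in pattern mining, but with an arbitrary fitness function
# in place of surprisingness. The vocabulary plays the role of the
# "pattern template" constraining which nodes may be added.

def expand_deme(seeds, vocabulary, fitness, threshold):
    """Attach each vocabulary op to each seed tree, keeping the
    expansions whose fitness clears the threshold."""
    kept = []
    for tree in seeds:
        for op in vocabulary:
            candidate = (op, tree)      # grow the program tree by one node
            score = fitness(candidate)
            if score >= threshold:
                kept.append((candidate, score))
    return kept
```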

...

Doing 1-3 would make things rather elegant (and combined with our
in-progress replacement of the NLP comprehension pipeline, would make
the whole of OpenCog finally somewhat pretty...)


-- Ben

-- 
Ben Goertzel, PhD
http://goertzel.org

“Our first mothers and fathers … were endowed with intelligence; they
saw and instantly they could see far … they succeeded in knowing all
that there is in the world. When they looked, instantly they saw all
around them, and they contemplated in turn the arch of heaven and the
round face of the earth. … Great was their wisdom …. They were able to
know all....

But the Creator and the Maker did not hear this with pleasure. … ‘Are
they not by nature simple creatures of our making? Must they also be
gods? … What if they do not reproduce and multiply?’

Then the Heart of Heaven blew mist into their eyes, which clouded
their sight as when a mirror is breathed upon. Their eyes were covered
and they could see only what was close, only that was clear to them.”

— Popol Vuh (holy book of the ancient Mayas)
