The NARS implementations I've been using do not have an explicit
simulation model, but there is probably a way of forming the Narsese to get
the results you're looking for.  My intuition is that you'd be looking for
frequency/confidence scores on a parent statement that indicate a
successful path to a goal state.
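As a rough illustration (the exact syntax here is my assumption and varies
between OpenNARS and ONA, so treat it as a sketch, not working input; the
operation and goal names are made up), the kind of derived statement you'd
hope to see is a temporal implication carrying a frequency/confidence pair:

```
<(&/, ^op_a, ^op_b) =/> goal>. %0.90;0.80%
```

Read loosely: "executing op_a and then op_b leads to the goal," believed
with frequency 0.90 at confidence 0.80.  A high frequency with rising
confidence on a statement like that would be the marker of a reliably
successful path to the goal state.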

At first glance, your problem sounded like a Monte Carlo Tree Search
candidate, since the strength of a node in the tree is estimated by
sampling its child nodes' success rates.  (And of course the search part
can be informed rather than truly random, so you can discard
nonsensical choices early.)
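To make the sampling idea concrete, here is a minimal sketch in Python.
It's flat Monte Carlo (one-ply sampling of random playouts) rather than
full UCT, and all the names are mine, not from any NARS or MCTS library:

```python
import random

# Flat Monte Carlo move scoring for Tic Tac Toe (a sketch, not full UCT):
# each legal move is rated by random playouts from the resulting position.

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
             (0, 3, 6), (1, 4, 7), (2, 5, 8),
             (0, 4, 8), (2, 4, 6)]

def winner(board):
    """Return 'X' or 'O' if someone has three in a row, else None."""
    for a, b, c in WIN_LINES:
        if board[a] is not None and board[a] == board[b] == board[c]:
            return board[a]
    return None

def moves(board):
    """Indices of empty cells on a 9-tuple board of 'X'/'O'/None."""
    return [i for i, cell in enumerate(board) if cell is None]

def play(board, i, player):
    """Return a new board with `player` placed at cell `i`."""
    return board[:i] + (player,) + board[i + 1:]

def rollout(board, to_move):
    """Play uniformly random moves to the end; return the winner or None."""
    while winner(board) is None and moves(board):
        board = play(board, random.choice(moves(board)), to_move)
        to_move = 'O' if to_move == 'X' else 'X'
    return winner(board)

def mcts_move(board, player, n_samples=200):
    """Pick the move whose random playouts score best for `player`."""
    opponent = 'O' if player == 'X' else 'X'
    best_move, best_score = None, None
    for m in moves(board):
        nxt = play(board, m, player)
        score = 0
        for _ in range(n_samples):
            w = rollout(nxt, opponent)
            score += 1 if w == player else (-1 if w == opponent else 0)
        if best_score is None or score > best_score:
            best_move, best_score = m, score
    return best_move
```

A real MCTS adds selection/expansion/backpropagation bookkeeping on top of
this, but the playout scoring above is the core of the "sampling of child
nodes' success rate" idea.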

On Sat, Sep 11, 2021 at 2:48 AM YKY (Yan King Yin, 甄景贤) <
[email protected]> wrote:

> When thinking about the game of Tic Tac Toe,
> I found that it is most natural to allow assumptions in the logic rules.
> 
> For example, consider the definition of a potential "fork",
> in which player X can win in 2 ways.
> 
> How can we write the rules to determine a potential fork?
> Here is a very "natural" way to state it:
> 
> assume X plays move a:
>     assume O plays an arbitrary (non-winning) move,
>         assume X plays move b then X wins,
>         or, assume X plays move c then X wins,
>     and b != c
> then move a is a potential fork.
> 
> So I wonder: how can a logic inference engine handle assumptions?
> Does OpenCog or NARS have this ability?
> 
> Thanks :)
> YKY
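Incidentally, the quoted fork rule has a direct procedural reading: move a
is a fork if, once played, X has at least two distinct immediate winning
replies, so O can block only one of them.  A small self-contained Python
check (board as a 9-tuple of 'X'/'O'/None; the helper names are mine):

```python
# A procedural reading of the quoted fork rule (a sketch; names mine):
# `move` is a fork for `player` if, once played, at least two distinct
# follow-up moves each win outright, so the opponent can block only one.

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
             (0, 3, 6), (1, 4, 7), (2, 5, 8),
             (0, 4, 8), (2, 4, 6)]

def wins(board, player):
    """True if `player` has completed any line."""
    return any(all(board[i] == player for i in line) for line in WIN_LINES)

def creates_fork(board, move, player):
    """True if playing `move` leaves `player` with two or more threats."""
    board = board[:move] + (player,) + board[move + 1:]
    threats = [m for m, cell in enumerate(board)
               if cell is None
               and wins(board[:m] + (player,) + board[m + 1:], player)]
    return len(threats) >= 2
```

For example, with X on opposite corners and O in the centre, taking a
third corner is the classic fork:
creates_fork(('X', None, None, None, 'O', None, None, None, 'X'), 2, 'X')
returns True, while the same call with move 1 returns False.  This is the
deterministic ground truth that a sampling approach would approximate.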


-- 
Daniel Jue
Cognami LLC
240-515-7802
www.cognami.ai

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T74958068c4e0a30f-M6fb3874e720945ed9f444b51
Delivery options: https://agi.topicbox.com/groups/agi/subscription
