> > (2) For the same reason, in NARS a statement might get different
> > "probability" attached, when derived from different evidence.
> > Probability theory does not have a general rule to handle
> > inconsistency within a probability distribution.
>
> The same statement holds for PLN, right?


PLN handles inconsistency within probability distributions using
higher-order probabilities... both explicitly and, more simply, by allowing
multiple inconsistent estimates of the same distribution to exist attached
to the same node or link...



>
> > If you work out a detailed solution along your path, you will see that
> > it will be similar to NARS when both are doing deduction with strong
> > evidence. The difference will show up (1) in cases where evidence is
> > rare, and (2) in non-deductive inferences, such as induction and
> > abduction. I believe this is also where NARS and PLN differ most.
>
> Guilty as charged! I have only tried to justify the deduction rule,
> not any of the others. I seriously didn't think about the blind spot
> until you mentioned it. I'll have to go back and take a closer look...


The NARS deduction rule closely approximates the PLN deduction rule in the case
where all the premise terms have roughly the same node probability.  It
particularly closely approximates the "concept geometry based" variant of
the PLN deduction rule, which is interesting: it means NARS deduction
approximates the PLN deduction rule variant one gets if one assumes
concepts are approximately spherically-shaped rather than being random sets.
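To make the approximation claim concrete, here is a minimal sketch. It assumes the standard NARS deduction frequency rule (derived frequency is the product of the premise frequencies) and the independence-based form of the PLN deduction strength formula; the specific input values are illustrative, not from the discussion above.

```python
def nars_deduction_frequency(f1, f2):
    # NARS deduction: derived frequency is the product of premise frequencies.
    return f1 * f2

def pln_deduction_strength(s_ab, s_bc, s_b, s_c):
    # Independence-based PLN deduction formula:
    #   s_AC = s_AB*s_BC + (1 - s_AB)*(s_C - s_B*s_BC)/(1 - s_B)
    # where s_B, s_C are the node probabilities of the middle and target terms.
    return s_ab * s_bc + (1 - s_ab) * (s_c - s_b * s_bc) / (1 - s_b)

# With strong evidence and equal node probabilities, the two nearly agree:
nars = nars_deduction_frequency(0.9, 0.9)            # 0.81
pln = pln_deduction_strength(0.9, 0.9, 0.5, 0.5)     # 0.82
print(nars, pln, abs(nars - pln))
```

With equal node probabilities and frequencies near 1, the PLN correction term (1 - s_AB)*(s_C - s_B*s_BC)/(1 - s_B) becomes small, so the two rules give nearly the same answer; the gap widens as evidence gets weaker or node probabilities diverge.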

The NARS induction and abduction rules do not closely approximate the PLN
induction and abduction rules...

-- Ben G



-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now