>
> >> I guess my previous question was not clear enough: if the only domain
> >> knowledge PLN has is
> >>
> >> > Ben is an author of a book on AGI <tv1>
> >> > This dude is an author of a book on AGI <tv2>
> >>
> >> and
> >>
> >> > Ben is odd <tv1>
> >> > This dude is odd <tv2>
> >>
> >> Will the system derive anything?
> >
> > Yes, via making default assumptions about node probability...
>
> Then what are the conclusions, with their truth-values, in each of the
> two cases?
>


Without node probability tv's, PLN actually behaves pretty similarly
to NARS in this case...

If we have

Ben ==> AGI-author <s1>
Dude ==> AGI-author <s2>
|-
Dude ==> Ben <s3>

the PLN abduction rule would yield

s3  = s1 s2 + w (1-s1)(1-s2)

where w is a parameter of the form

w = p/ (1-p)

and if we set w=1 which is a principle of indifference type
assumption then we just have

s3 = 1 - s1 - s2 + 2s1s2

In any case, regardless of w, s1=s2=1 implies s3=1
in this formula, which is the same answer NARS gives
in this case (of crisp premises)
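The strength formula above is easy to sanity-check numerically; here is a small Python sketch of it (illustrative only, not the actual PLN implementation -- the function name is mine):

```python
# Hypothetical sketch of the simple PLN abduction strength formula:
#   s3 = s1*s2 + w*(1-s1)*(1-s2),   with w = p/(1-p).
# With w = 1 (principle-of-indifference assumption) this reduces to
#   s3 = 1 - s1 - s2 + 2*s1*s2.

def abduction_strength(s1, s2, w=1.0):
    return s1 * s2 + w * (1 - s1) * (1 - s2)

# Crisp premises: s1 = s2 = 1 gives s3 = 1 regardless of w,
# matching NARS's answer on this case.
print(abduction_strength(1.0, 1.0, w=0.3))  # -> 1.0

# Non-crisp premises, w = 1: agrees with 1 - s1 - s2 + 2*s1*s2
print(abduction_strength(0.8, 0.9))  # -> 0.74
```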

Similar to NARS, PLN also gives a fairly low confidence
to this case, but the confidence formula is a pain and I
won't write it out here...  (i.e., PLN assigns this a beta
distribution with 1 in its support, but a pretty high variance...)

So, similar to NARS, without node probability info PLN cannot
distinguish the two inference examples I gave... no system could...

However, PLN incorporates the node probabilities when available,
immediately and easily, without requiring knowledge of math on
the part of the system... and it incorporates them according to Bayes
rule, which I believe is the right approach ...

What is counterintuitive to me is having an inference engine that
does not immediately and automatically use the node probability info
when it is available...

As evidence about Bayesian neural population coding in the brain suggests,
use of Bayes rule is probably MORE cognitively primary than use of
these other more complex inference rules...

-- ben g


p.s.
details:

In PLN,
simple abduction consists of the inference problem:
Given P(A), P(B), P(C), P(B|A) and P(B|C), find P(C|A).

and the simplest, independence-assumption + Bayes rule based formula
for this is

abdAC:=(sA,sB,sC,sAB,sCB)->(sAB*sCB*sC/sB+(1-sAB)*(1-sCB)*sC/(1-sB))

[or, more fully including all consistency conditions,

abdAC:=
(sA,sB,sC,sAB,sCB)->
  (sAB*sCB*sC/sB+(1-sAB)*(1-sCB)*sC/(1-sB))
  *(Heaviside(sAB-max(((sA+sB-1)/sA),0))-Heaviside(sAB-min(1,(sB/sA))))
  *(Heaviside(sCB-max(((sB+sC-1)/sC),0))-Heaviside(sCB-min(1,(sB/sC))));

]

(This is Maple notation...)
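For those without Maple, here is a direct Python transcription of the same formula (my translation, with the second term written in terms of sCB = P(B|C) as in the derivation; the Heaviside factors zero out inputs violating the consistency conditions, and I treat the boundary as consistent since Maple leaves Heaviside(0) undefined):

```python
# Python transcription of the Maple abdAC formula.
# Inputs: sA=P(A), sB=P(B), sC=P(C), sAB=P(B|A), sCB=P(B|C); output: P(C|A).

def heaviside(x):
    # Step function; boundary convention Heaviside(0)=1 is an assumption here
    # (Maple leaves Heaviside(0) undefined).
    return 1.0 if x >= 0 else 0.0

def abd_ac(sA, sB, sC, sAB, sCB):
    # Independence-assumption + Bayes-rule abduction strength:
    # P(C|A) = P(C|B)P(B|A) + P(C|~B)P(~B|A), with P(C|B), P(C|~B) via Bayes.
    strength = sAB * sCB * sC / sB + (1 - sAB) * (1 - sCB) * sC / (1 - sB)
    # Consistency windows: max((sA+sB-1)/sA, 0) <= sAB <= min(1, sB/sA), etc.
    ok_ab = (heaviside(sAB - max((sA + sB - 1) / sA, 0.0))
             - heaviside(sAB - min(1.0, sB / sA)))
    ok_cb = (heaviside(sCB - max((sB + sC - 1) / sC, 0.0))
             - heaviside(sCB - min(1.0, sB / sC)))
    return strength * ok_ab * ok_cb

# Consistent inputs give the Bayes-rule strength...
print(abd_ac(0.5, 0.4, 0.3, 0.5, 0.6))  # -> 0.325
# ...while inconsistent ones (here sAB > sB/sA = 0.8) are zeroed out.
print(abd_ac(0.5, 0.4, 0.3, 0.9, 0.6))  # -> 0.0
```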


