Abram,

I think the best place to start, in exploring the relation between NARS
and probability theory, is with Definition 3.7 in the paper

From Inheritance Relation to Non-Axiomatic Logic
http://www.cogsci.indiana.edu/pub/wang.inheritance_nal.ps
[International Journal of Approximate Reasoning, 11(4), 281-319, 1994]

which is downloadable from

http://nars.wang.googlepages.com/nars%3Apublication

It is instructive to look at specific situations, and see how this
definition leads one to model situations differently from the way one
traditionally uses probability theory to model such situations.

The next place to look, in exploring this relation, is at the semantics that
3.7 implies for the induction and abduction rules.  Note that unlike in PLN
there are no term (node) probabilities in NARS, so that induction and
abduction cannot rely on Bayes rule or any close analogue of it.  They must
be justified on quite different grounds.  If you can formulate a
probabilistic justification of the NARS induction and abduction truth
value formulas, I'll be quite interested.  I'm not saying it's impossible,
just that it's not obvious ... one has to grapple with 3.7 and the fact
that the NARS relative frequency w+/w (positive evidence over total
evidence) combines intension and extension in a manner that is unusual
relative to ordinary probabilistic treatments.

The math here is simple enough that one does not need to do hand-wavy
philosophizing ;-) ... it's just elementary algebra.  The subtle part is
really the semantics, i.e. the way the math is used to model situations.
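To make the evidence-based semantics concrete: per the standard NAL
definitions, frequency is f = w+/w and confidence is c = w/(w+k), where k
is the "evidential horizon" personality parameter. A minimal sketch (the
evidence counts below are illustrative, not from the paper):

```python
# Sketch of the standard NAL evidence-to-truth-value mapping:
# f = w+/w (relative frequency), c = w/(w+k) (confidence),
# where k is the evidential-horizon personality parameter.

def nars_truth(w_plus, w, k=1):
    """Convert evidence counts (w+ positive out of w total) to (f, c)."""
    f = w_plus / w       # frequency: fraction of evidence that is positive
    c = w / (w + k)      # confidence: approaches 1 as evidence accumulates
    return f, c

print(nars_truth(3, 4))   # 3 positive out of 4 -> (0.75, 0.8)
```

Note that f mixes extensional and intensional evidence in a single count,
which is exactly the point where the usual probabilistic reading breaks down.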

-- Ben G



On Sat, Sep 20, 2008 at 2:22 PM, Abram Demski <[EMAIL PROTECTED]> wrote:

> It has been mentioned several times on this list that NARS has no
> proper probabilistic interpretation. But, I think I have found one
> that works OK. Not perfectly. There are some differences, but the
> similarity is striking (at least to me).
>
> I imagine that what I have come up with is not too different from what
> Ben Goertzel and Pei Wang have already hashed out in their attempts to
> reconcile the two, but we'll see. The general idea is to treat NARS as
> probability plus a good number of regularity assumptions that justify
> the inference steps of NARS. However, since I make so many
> assumptions, it is very possible that some of them conflict. This
> would show that NARS couldn't fit into probability theory after all,
> but it is still interesting even if that's the case...
>
> So, here's an outline. We start with the primitive inheritance
> relation, A inh B; this could be called "definite inheritance",
> because it means that A inherits all of B's properties, and B inherits
> all of A's instances. B is a superset of A. The truth value is 1 or 0.
> Then, we define "probabilistic inheritance", which carries a
> probability that a given property of B will be inherited by A and that
> a given instance of A will be inherited by B. Probabilistic
> inheritance behaves somewhat like the full NARS inheritance: if we
> reason about likelihoods (the probability of the data assuming (A
> prob_inh B) = x), the math is actually the same EXCEPT we can only use
> primitive inheritance as evidence, so we can't spread evidence around
> the network by (1) treating prob_inh with high evidence as if it were
> primitive inh or (2) attempting to use deduction to accumulate
> evidence as we might want to, so that evidence for "A prob_inh B" and
> evidence for "B prob_inh C" gets combined to evidence for "A prob_inh
> C".
>
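The likelihood computation described above can be sketched with a binomial
model: the probability of the observed primitive-inheritance evidence,
given that "A prob_inh B" holds with probability x. This is my illustrative
reading, not necessarily Abram's exact formulation:

```python
from math import comb

# Sketch (illustrative): binomial likelihood of observing w_plus positive
# outcomes among w independent checks against primitive inheritance,
# assuming "A prob_inh B" holds with probability x.

def likelihood(x, w_plus, w):
    return comb(w, w_plus) * x**w_plus * (1 - x)**(w - w_plus)

# The likelihood is maximized at x = w_plus / w, the NARS frequency.
print(likelihood(0.75, 3, 4))
```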
> So, we can define a second-order-probabilistic-inheritance "prob_inh2"
> that is for prob_inh what prob_inh is for inh. We can define a
> third-order over the second-order, a fourth over the third, and so
> on. In fact, each of these are generalizations: simple inheritance can
> be seen as a special case of prob_inh (where the probability is 1),
> prob_inh is a special case of prob_inh2, and so on. This means we can
> define an infinite-order probabilistic inheritance, prob_inh_inf,
> which is a generalization of any given level. The truth value of
> prob_inh_inf will be very complicated (since each prob_inhN has a more
> complicated truth value than the last, and prob_inh_inf will include
> the truth values from each level).
>
> My proposal is to add 2 regularity assumptions to this structure.
> First, we assume that the prior over probability values for prob_inh
> is uniform. This gives us some permission to act as if the probability
> and the likelihood are the same thing, which brings the math closer to
> NARS. Second, assume that a "high" truth value on one level strongly
> implies a high one at the next level, and similarly that low implies
> low. They will already weakly imply each other, but I think the math
> could be brought closer to NARS with a stronger assumption. I don't
> have any precise suggestions however. The idea here is to allow
> evidence that properly should only be counted for prob_inh2 to count
> for prob_inh as well, which is the case in NARS. This is point (1)
> above. More generally, it justifies the NARSian practice of using the
> simple prob_inh likelihood as if it were a likelihood for
> prob_inh_inf, so that it recursively acts on other instances of itself
> rather than only on simple inh.
>
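The first regularity assumption can be made concrete: with a uniform
(Beta(1,1)) prior over the probability value x, the posterior mean after
w+ positive out of w observations is Laplace's rule of succession,
(w+ + 1)/(w + 2). This happens to coincide with the NARS expectation
e = (w+ + k/2)/(w + k) when k = 2; that identification is an observation
of mine, not something stated in the message:

```python
# Sketch: under a uniform (Beta(1,1)) prior on x, the posterior after
# w_plus positive out of w observations is Beta(w_plus + 1, w - w_plus + 1),
# whose mean is Laplace's rule of succession.

def posterior_mean(w_plus, w):
    return (w_plus + 1) / (w + 2)

# NARS expectation e = (w+ + k/2) / (w + k); the two coincide at k = 2.
def nars_expectation(w_plus, w, k=2):
    return (w_plus + k / 2) / (w + k)

print(posterior_mean(3, 4), nars_expectation(3, 4))
```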
> Of course, since I have not given precise definitions, this solution
> is difficult to evaluate. But, I thought it would be of interest.
>
> --Abram Demski
>
>
> -------------------------------------------
> agi
> Archives: https://www.listbox.com/member/archive/303/=now
> RSS Feed: https://www.listbox.com/member/archive/rss/303/
> Modify Your Subscription:
> https://www.listbox.com/member/?&;
> Powered by Listbox: http://www.listbox.com
>



-- 
Ben Goertzel, PhD
CEO, Novamente LLC and Biomind LLC
Director of Research, SIAI
[EMAIL PROTECTED]

"Nothing will ever be attempted if all possible objections must be first
overcome" - Dr Samuel Johnson


