Hmm... I didn't mean infinite evidence, only infinite time and space
with which to compute the consequences of evidence. But that is
interesting too.

The higher-order probabilities I'm talking about introducing do not
reflect inaccuracy at all. :)
This may seem odd, but it seems to me to follow from your development
of NARS... so the difficulty for me is to account for why you can
exclude it in your system. Of course, this need arises only from
interpreting your definitions probabilistically.

I think I have come up with a more specific proposal. I will try to
write it up properly and see if it works.

--Abram

On Sat, Sep 20, 2008 at 11:28 PM, Pei Wang <[EMAIL PROTECTED]> wrote:
> On Sat, Sep 20, 2008 at 11:02 PM, Abram Demski <[EMAIL PROTECTED]> wrote:
>> You are right in what you say about (1). The truth is, my analysis is
>> meant to apply to NARS operating with unrestricted time and memory
>> resources (which of course is not the point of NARS!). So, the
>> question is whether NARS approaches a probability calculation as it is
>> given more time to use all its data.
>
> That is an interesting question. When the weight of evidence w goes
> to infinity, confidence converges to 1, and frequency converges to
> the limit of the proportion of positive evidence among all evidence,
> so it becomes a probability, under a certain interpretation.
> Therefore, as far as a single truth value is concerned, probability
> theory is an extreme case of NARS.
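>
> As a rough numeric sketch (assuming the usual NARS definitions
> f = w+/w and c = w/(w+k) with evidential horizon k = 1, and an
> illustrative 70% positive-evidence rate):
>
>     # NARS truth values as the weight of evidence w grows.
>     # Assumes f = w_plus / w and c = w / (w + k), with k = 1.
>     k = 1.0
>     for w in [1.0, 10.0, 100.0, 10000.0]:
>         w_plus = 0.7 * w          # suppose 70% of evidence is positive
>         f = w_plus / w            # frequency
>         c = w / (w + k)           # confidence
>         print(w, f, round(c, 4))  # c -> 1 and f -> 0.7 as w grows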
>
> However, when all the truth values in the system are taken into
> account, that is not necessarily true, because the two theories
> specify the relations among statements/propositions differently. For
> example, probability theory has the conditional B|A, while NARS uses
> the implication A==>B, which are similar, but not the same. Of
> course, there are some overlaps,
> such as disjunction and conjunction, where NARS converges to
> probability theory in the extreme case (infinite evidence).
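>
> (A small sketch of that extreme case, assuming the NAL-style truth
> functions f_and = f1*f2 and f_or = 1 - (1-f1)(1-f2), and independent
> events on the probability side:)
>
>     # With full evidence (c -> 1), NARS conjunction/disjunction
>     # frequencies match the probabilities of independent events.
>     f1, f2 = 0.8, 0.5               # frequencies, confidences near 1
>     f_and = f1 * f2                 # NAL intersection frequency
>     f_or = 1 - (1 - f1) * (1 - f2)  # NAL union frequency
>     p_and = f1 * f2                 # P(A and B), independence assumed
>     p_or = f1 + f2 - f1 * f2        # P(A or B), independence assumed
>     assert abs(f_and - p_and) < 1e-12
>     assert abs(f_or - p_or) < 1e-12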
>
>> As for higher values... NARS and PLN may be using them for the purpose
>> you mention, but that is not the purpose I am giving them in my
>> analysis! In my analysis, I am simply trying to justify the deductions
>> allowed in NARS in a probabilistic way. Higher-order probabilities are
>> potentially useful here because of the way you sum evidence. Simply
>> put, it is as if NARS purposefully ignores the distinction between
>> different probability levels, so that a NARS frequency is also a
>> frequency-of-frequencies and frequency-of-frequency-of-frequencies and
>> so on, all the way up.
>
> I see what you mean, but as it is currently defined, in NARS there
> is no need to introduce higher-order probabilities --- frequency is
> not an estimate of a "true probability". It is uncertain because of
> the influence of new evidence, not because it is inaccurate.
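>
> A tiny sketch of that reading (f = w+/w is simply recomputed as
> evidence arrives; the counts below are made up):
>
>     # Frequency summarizes the evidence collected so far; it is not
>     # an estimate of a hidden "true probability".
>     w_plus, w = 3.0, 4.0   # 3 positive pieces out of 4
>     print(w_plus / w)      # f = 0.75 now
>     w += 2.0               # two new negative pieces arrive
>     print(w_plus / w)      # f = 0.5 now; neither value was "wrong"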
>
>> The simple way of dealing with this is to say that it is wrong, and
>> results from a confusion of similar-looking mathematical entities.
>> But, to some extent, it is intuitive: I should not care too much in
>> normal reasoning which "level" of inheritance I'm using when I say
>> that a truck is a type of vehicle. So the question is, can this be
>> justified probabilistically? I think I can give a very tentative
>> "yes".
>
> Hopefully we'll know better about that when you explore further. ;-)
>
> Pei
>
>> --Abram
>>
>> On Sat, Sep 20, 2008 at 9:38 PM, Pei Wang <[EMAIL PROTECTED]> wrote:
>>> On Sat, Sep 20, 2008 at 9:09 PM, Abram Demski <[EMAIL PROTECTED]> wrote:
>>>>>
>>>>> (1) In probability theory, an event E has a constant probability P(E)
>>>>> (which can be unknown). Given the assumption of insufficient knowledge
>>>>> and resources, in NARS P(A-->B) would change over time, as more and
>>>>> more evidence is taken into account. This process cannot be treated as
>>>>> conditioning because, among other things, the system can neither
>>>>> explicitly list all the evidence as a condition, nor update the probability
>>>>> of all statements in the system for each piece of new evidence (so as
>>>>> to treat all background knowledge as a default condition).
>>>>> Consequently, at any moment P(A-->B) and P(B-->C) may be based on
>>>>> different, though unspecified, data, so it is invalid to use them in a
>>>>> rule to calculate the "probability" of A-->C --- probability theory
>>>>> does not allow cross-distribution probability calculation.
>>>>
>>>> This is not a problem the way I set things up. The likelihood of a
>>>> statement is welcome to change over time, as the evidence changes.
>>>
>>> If each of them is changed independently, you don't have a single
>>> probability distribution anymore, but a bunch of them. In the above
>>> case, you don't really have P(A-->B) and P(B-->C), but P_307(A-->B)
>>> and P_409(B-->C). How can you use two probability values together if
>>> they come from different distributions?
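>>>
>>> (An illustrative sketch of the problem, with made-up evidence
>>> counts; 307 and 409 just label the two evidence states:)
>>>
>>>     # Two "probabilities" computed from different evidence states.
>>>     p_ab_307 = 18 / 20  # P_307(A-->B), evidence seen by step 307
>>>     p_bc_409 = 30 / 50  # P_409(B-->C), evidence seen by step 409
>>>     # Multiplying them mixes two distributions; probability theory
>>>     # licenses no such cross-distribution product.
>>>     print(p_ab_307 * p_bc_409)  # a number, but not a probability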
>>>
>>>>> (2) For the same reason, in NARS a statement might get different
>>>>> "probability" values attached when derived from different evidence.
>>>>> Probability theory does not have a general rule to handle
>>>>> inconsistency within a probability distribution.
>>>>
>>>> The same statement holds for PLN, right?
>>>
>>> Yes. Ben proposed a solution, which I won't comment on until I see
>>> all the details in the PLN book.
>>>
>>>>> The first half is fine, but the second isn't. As the previous example
>>>>> shows, in NARS a high Confidence does imply that the Frequency value
>>>>> is a good summary of the evidence, but a low Confidence does not imply
>>>>> that the Frequency is bad, just that it is not very stable.
>>>>
>>>> But I'm not talking about confidence when I say "higher". I'm talking
>>>> about the system of levels I defined, for which it is perfectly OK.
>>>
>>> Yes, but the whole purpose of adding another value is to handle
>>> inconsistency and belief revision. Higher-order probability is
>>> mathematically sound, but won't do this work.
>>>
>>> Think about a concrete example: if from one source the system gets
>>> P(A-->B) = 0.9, and P(P(A-->B) = 0.9) = 0.5, while from another source
>>> P(A-->B) = 0.2, and P(P(A-->B) = 0.2) = 0.7, then what will be the
>>> conclusion when the two sources are considered together?
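>>>
>>> (For contrast, a sketch of NARS-style revision on the analogous
>>> case, reading each source's second number as a NARS confidence
>>> rather than a higher-order probability, and assuming f = w+/w,
>>> c = w/(w+k) with k = 1, so w = k*c/(1-c):)
>>>
>>>     def revise(f1, c1, f2, c2, k=1.0):
>>>         # Recover evidence weights from (frequency, confidence),
>>>         # pool the evidence, and convert back to a truth value.
>>>         w1 = k * c1 / (1 - c1)
>>>         w2 = k * c2 / (1 - c2)
>>>         w_plus = f1 * w1 + f2 * w2  # pooled positive evidence
>>>         w = w1 + w2                 # pooled total evidence
>>>         return w_plus / w, w / (w + k)
>>>
>>>     print(revise(0.9, 0.5, 0.2, 0.7))  # roughly (0.41, 0.77)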
>>>
>>> Pei
>>>
>>>
>>
>

