On 05/21/2018 01:48 PM, Ben Goertzel wrote:
But Nil -- those record-keeping links can be put in an auxiliary
Atomspace, not necessarily the same Atomspace where the main thrust of
reasoning is proceeding...
Yes, but for rules like incremental direct calculation, and for TV revision
in general, it seems traces do need to be taken into account while
reasoning is taking place. That doesn't mean they have to pollute the
main atomspace, but it does show that some form of attention allocation
/ meta-learning will be necessary. Well, that is true regardless of this
problem; it just adds more weight to the scale.
Nil
On Mon, May 21, 2018 at 8:45 AM, 'Nil Geisweiller' via opencog
<[email protected]> wrote:
On 05/19/2018 09:00 PM, Alexey Potapov wrote:
Our knowledge is built from data. Deduction systems (probabilistic or not)
lack this connection, while functional PPLs are well-suited for this.
A deduction system can be understood very broadly, and may encompass
inferences based on PPL models as well.
PLN definitely draws, at least in principle, the relationship between
deduction and data.
ATM in practice it's a bit lacking though. For instance, the link between the TV of

    Implication <TV>
        P
        Q

and the instances of P and Q it was obtained from is forgotten after the
inference. This should be corrected. Meaning the inference rule
    D ;; <- instances of P and Q
    |-
    Implication <TV>
        P
        Q

should be more something like

    LinkBetweenDataAndImplication <TV>
        D ;; <- instances of P and Q
        Implication
            P
            Q
    d ;; <- new instance pair of P and Q
    |-
    LinkBetweenDataAndImplication <TV_update>
        Cons
            d
            D
        Implication
            P
            Q
It would also provide an incremental way to calculate the TV, as opposed to
batch-processing all the instances every time.
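For what it's worth, here is a minimal sketch in Python of what that incremental calculation could look like. The class, field names, and the simple-TV model (strength = n_pq / n_p, count = n_p) are my own illustration of the idea, not an actual PLN/Atomspace API:

```python
# Sketch: incremental direct evaluation of the TV of (Implication P Q).
# Each call to update() folds in one new instance pair, playing the role
# of `d` in the rule above, so no batch pass over D is ever needed.

class ImplicationTV:
    """Track the TV of (Implication P Q) from observed instance pairs."""

    def __init__(self):
        self.n_p = 0    # instances satisfying P
        self.n_pq = 0   # instances satisfying both P and Q

    def update(self, p: bool, q: bool) -> None:
        """Incorporate one new instance pair into the TV."""
        if p:
            self.n_p += 1
            if q:
                self.n_pq += 1

    @property
    def strength(self) -> float:
        # Fraction of P-instances that also satisfy Q.
        return self.n_pq / self.n_p if self.n_p else 0.0

    @property
    def count(self) -> int:
        # Evidence count: number of P-instances seen so far.
        return self.n_p


tv = ImplicationTV()
for p, q in [(True, True), (True, False), (True, True), (False, True)]:
    tv.update(p, q)
print(tv.strength, tv.count)  # strength 2/3 from 3 P-instances
```

Of course the real thing would have to handle fuzzy/probabilistic instance memberships and confidence-weighted revision, but the point stands: keeping the link between D and the Implication is what makes the O(1) update possible.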
It's kinda scary, computationally speaking, but it seems that to do this
well, most inference traces need to be recorded, not just conclusions. Yet
another meta-learning black hole...
Nil
--
You received this message because you are subscribed to the Google Groups
"opencog" group.
To unsubscribe from this group and stop receiving emails from it, send an
email to [email protected].
To post to this group, send email to [email protected].
Visit this group at https://groups.google.com/group/opencog.
To view this discussion on the web visit
https://groups.google.com/d/msgid/opencog/a9852fa0-fc0d-bedc-7279-750d13ca8599%40gmail.com.
For more options, visit https://groups.google.com/d/optout.