Mike Tintner wrote:
RL:Suppose that in some significant part of Novamente there is a
representation system that uses "probability" or "likelihood" numbers to
encode the strength of facts, as in [I like cats](p=0.75). The (p=0.75)
is supposed to express the idea that the statement [I like cats] is in
some sense "75% true".
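The quoted passage describes a representation scheme in which each fact carries an explicit strength number. A minimal sketch of that idea (my own illustration, not Novamente's actual API or data structures):

```python
# Toy sketch of the representation described above: a statement paired
# with an explicit, surface-level "probability" strength, as in
# [I like cats](p=0.75). Names here are hypothetical.

class Fact:
    """A statement annotated with a probability-like strength in [0, 1]."""

    def __init__(self, statement, strength):
        if not 0.0 <= strength <= 1.0:
            raise ValueError("strength must lie in [0, 1]")
        self.statement = statement
        self.strength = strength

    def __repr__(self):
        # Render in the [statement](p=strength) notation used in the quote.
        return f"[{self.statement}](p={self.strength})"

fact = Fact("I like cats", 0.75)
print(fact)  # [I like cats](p=0.75)
```

The point at issue in the rest of the thread is whether such a number should be stored and interpreted at the surface like this at all.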
This essay seems to be a very good demonstration of why the human system
almost certainly does not use numbers, or anything like them, as stores of
value - but raw, crude emotions. "How much do you like cats [or
marshmallow ice cream]?" "Miaow! [or yummy]" [those being an expression
of internal nervous and muscular impulses] "And black cats [or
strawberry marshmallow]?" "Miaow-miaoww! [or yummy yummy]" It's crude,
but it's practical.
It is all a question of what role the numbers play. Conventional AI
wants them at the surface, and transparently interpretable.
I am not saying that there are no numbers, only that they are below
the surface, and not directly interpretable. That might or might not
gibe with what you are saying ... although I would not go so far as to
put it the way you do.
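To make the contrast concrete, here is a toy sketch (entirely my own illustration, not a description of any actual system) of what "numbers below the surface" could mean: the system stores only opaque sub-symbolic activations, and anything resembling a strength value exists only implicitly, computed on demand rather than stored as an interpretable quantity:

```python
# Contrast sketch: no stored, interpretable p-value per fact.
# Each concept has only a vector of opaque internal activations;
# a crude "response strength" (the 'miaow' level) is an aggregate
# that emerges from them and is never stored at the surface.
# All names and the random-vector stand-in are hypothetical.
import random

random.seed(0)  # deterministic stand-in for opaque internal state

state = {
    concept: [random.random() for _ in range(8)]
    for concept in ("cats", "black cats")
}

def response_strength(concept):
    # A blunt aggregate of hidden activations; the individual
    # numbers are not interpretable as probabilities of anything.
    activations = state[concept]
    return sum(activations) / len(activations)
```

The difference from the earlier scheme is that nothing here labels a statement as "75% true"; any number a researcher might read off is a derived summary of hidden state, not a value the system itself represents or reasons over at the surface.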
Richard Loosemore
-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=8660244&id_secret=64636829-14d428