Perhaps the trick is to map the logic into some feature space where you can
approximate Cn by a simple matrix multiplication, very much like a kernel
method (perhaps you could apply the kernel trick in some form of kernel
regression that gives you the projection :-?).
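
To make that concrete, here is a rough sketch in Python (scikit-learn's
KernelRidge; the fixed-length vector encoding of formula sets is entirely
made up for illustration, not anyone's actual representation) of learning Cn
from (F, Cn(F)) pairs as a map that is linear in kernel feature space:

import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
d = 32                                     # dim of the (hypothetical) formula encoding
X = rng.normal(size=(200, d))              # encodings of premise sets F
Y = np.tanh(X @ rng.normal(size=(d, d)))   # stand-in for encodings of Cn(F)

# Kernel ridge regression: the learned map is linear in the RBF feature
# space, i.e. "Cn as a matrix multiplication" after the kernel trick.
model = KernelRidge(kernel="rbf", gamma=0.1, alpha=1e-3)
model.fit(X, Y)

F_new = rng.normal(size=(1, d))
Cn_approx = model.predict(F_new)           # approximate consequences, as a vector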

Another (perhaps not so) related idea would be to apply some form of
compressed sensing to the space of arguments: if the arguments occupy a
dense region of some (continuous) space, you may be able to find a basis in
which their representation is very sparse, with no loss of information.
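
Here is a minimal sketch of the compressed-sensing idea itself, with made-up
toy data: a signal that is sparse in the DCT basis is recovered from a small
number of random linear measurements by L1-penalised regression
(scikit-learn's Lasso standing in for a proper basis-pursuit solver):

import numpy as np
from scipy.fft import idct
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, m, k = 256, 64, 5                       # ambient dim, measurements, sparsity

B = idct(np.eye(n), axis=0, norm="ortho")  # synthesis matrix: signal = B @ coeffs
coeffs = np.zeros(n)
coeffs[rng.choice(n, k, replace=False)] = rng.normal(size=k)
signal = B @ coeffs                        # dense-looking signal, sparse in the DCT basis

Phi = rng.normal(size=(m, n)) / np.sqrt(m) # random measurement matrix
y = Phi @ signal                           # only m measurements, m << n

# Recover the sparse coefficients from y = (Phi @ B) @ coeffs via L1 regression.
lasso = Lasso(alpha=1e-3, fit_intercept=False, max_iter=50000)
lasso.fit(Phi @ B, y)
signal_rec = B @ lasso.coef_
print(np.linalg.norm(signal_rec - signal)) # small if recovery succeeded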




On Mon, Jan 20, 2014 at 9:15 AM, YKY (Yan King Yin, 甄景贤) <
[email protected]> wrote:

>
> PS:  A simplification is to break the consequence operator into "single
> steps", which is a trick known in classical logic-based AI.  So Cn(F) = Lim
> St^k(F) as k -> infinity, where St is the single-step deduction operator.
>
> Even then, the St operator seems more complex than matrix multiplication,
> as it involves matching the KB of facts with quantified logic formulas
> (known as "rules"), via the unification algorithm.
>
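
To make the St operator concrete (and to show why it already goes beyond a
matrix multiply), here is a toy Python sketch -- the fact/rule encoding is my
own made-up format, not your actual KB -- that applies a single-step
forward-chaining operator and iterates it to a fixpoint:

# Facts are ground tuples; rules are (premises, conclusion) with variables "?x".
# St(kb) adds every conclusion whose premises all match facts in kb;
# Cn is the fixpoint of iterating St.  (Toy pattern matching against ground
# facts only -- full unification is more involved.)

def match(pattern, fact, subst):
    """Extend subst so that pattern matches the ground fact, or return None."""
    if len(pattern) != len(fact):
        return None
    s = dict(subst)
    for p, f in zip(pattern, fact):
        if isinstance(p, str) and p.startswith("?"):
            if s.get(p, f) != f:
                return None
            s[p] = f
        elif p != f:
            return None
    return s

def single_step(kb, rules):
    """St: kb plus every conclusion whose premises all match facts in kb."""
    new = set(kb)
    for premises, conclusion in rules:
        substs = [{}]
        for prem in premises:                  # join the premises left to right
            substs = [s2 for s in substs for fact in kb
                      if (s2 := match(prem, fact, s)) is not None]
        for s in substs:
            new.add(tuple(s.get(t, t) for t in conclusion))
    return new

def consequences(kb, rules):
    """Cn(kb) = limit of St^k(kb): iterate until nothing new is added."""
    while True:
        nxt = single_step(kb, rules)
        if nxt == kb:
            return kb
        kb = nxt

facts = {("parent", "alice", "bob"), ("parent", "bob", "carol")}
rules = [([("parent", "?x", "?y"), ("parent", "?y", "?z")],
          ("grandparent", "?x", "?z"))]
print(consequences(facts, rules))
# adds ('grandparent', 'alice', 'carol') on the first step, then stabilises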


