I'm in the process of reading this paper:

http://www.jair.org/papers/paper1410.html

It might answer a couple of your questions. And, it looks like it has
an interesting proposal about generating heuristics from the problem
description. The setting is boolean rather than first-order. It
discusses the point about resolution being slow in practice.
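
Since the setting is boolean, the core operation is just propositional
resolution. A minimal sketch (illustrative only, not taken from the
paper), representing a clause as a frozenset of (name, polarity)
literals:

```python
# One propositional resolution step: given two clauses that contain a
# complementary pair of literals, derive their resolvent.
# A literal is a (name, polarity) pair, e.g. ("P", True) for P and
# ("P", False) for ~P. A clause is a frozenset of literals.

def resolve(c1, c2):
    """Return all resolvents of two clauses (may be an empty list)."""
    resolvents = []
    for (name, pol) in c1:
        if (name, not pol) in c2:
            # Drop the complementary pair, union the remainders.
            r = (c1 - {(name, pol)}) | (c2 - {(name, not pol)})
            resolvents.append(frozenset(r))
    return resolvents

# Example: {P, Q} and {~P, R} resolve on P, giving {Q, R}.
c1 = frozenset([("P", True), ("Q", True)])
c2 = frozenset([("P", False), ("R", True)])
print(resolve(c1, c2))
```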

--Abram Demski

On Tue, Sep 23, 2008 at 3:31 AM, YKY (Yan King Yin)
<[EMAIL PROTECTED]> wrote:
> On Thu, Sep 18, 2008 at 3:06 AM, Ben Goertzel <[EMAIL PROTECTED]> wrote:
>>
>> Prolog is not fast; it is painfully slow for complex inferences due to using
>> backtracking as a control mechanism
>>
>> The time-complexity issue that matters for inference engines is
>> inference-control ... i.e. dampening the combinatorial explosion (which
>> backtracking does not do)
>>
>> Time-complexity issues within a single inference step can always be handled
>> via mathematical or code optimization, whereas optimizing inference control
>> is a deep, deep AI problem...
>>
>> So, actually, the main criterion for the AGI-friendliness of an inference
>> scheme is whether it lends itself to flexible, adaptive control via
>>
>> -- taking long-term, cross-problem inference history into account
>>
>> -- learning appropriately from noninferential cognitive mechanisms (e.g.
>> attention allocation...)
>
> (I've been busy implementing my AGI in Lisp recently...)
>
> I think optimization of single inference steps and using global
> heuristics are both important.
>
> Prolog uses backtracking, but in my system I use all sorts of search
> strategies, not to mention abduction and induction.  Also, currently
> I'm using general resolution instead of SLD resolution, which is for
> Horn clauses only.  But one problem I face is that when I want to deal
> with equalities I have to use paramodulation (or some similar trick).
> This makes things more complex and as you know, I don't like it!
>
> I wonder if PLN has a binary-logic subset, or is every TV
> probabilistic by default?
>
> If you have a binary logic subset, then how does that subset differ
> from classical logic?
>
> People have said many times that resolution is inefficient, but I have
> never seen a theorem that says resolution is "slower" than other
> deduction methods such as natural deduction or tableaux.  All such
> talk is based on anecdotal impressions.  Also, I don't see why other
> deduction methods are that much different from resolution since their
> inference steps correspond to resolution steps very closely.  Also, if
> you can apply heuristics in other deduction methods you can do the
> same with resolution.  All in all, I see no reason why resolution is
> inferior.
>
> So I'm wondering if there is some novel way of doing binary logic that
> somehow makes inference faster than with classical logic.  And exactly
> what is the price to be paid?  What aspects of classical logic are
> lost?
>
> YKY
>
>
> -------------------------------------------
> agi
> Archives: https://www.listbox.com/member/archive/303/=now
> RSS Feed: https://www.listbox.com/member/archive/rss/303/
> Modify Your Subscription: https://www.listbox.com/member/?&;
> Powered by Listbox: http://www.listbox.com
>
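To make the "combinatorial explosion" point concrete: a naive
saturation loop over binary resolution, with no inference control at
all, looks like this (a self-contained sketch; clause encoding and
names are illustrative). Every round resolves all pairs, so the clause
set can blow up combinatorially, which is exactly the control problem
Ben describes:

```python
# Naive saturation-based refutation by propositional resolution.
# A literal is (name, polarity); a clause is a frozenset of literals.
# Without heuristics or subsumption, the derived-clause set can grow
# combinatorially with each round; this is the control problem.
from itertools import combinations

def resolvents(c1, c2):
    """All clauses derivable from c1, c2 by one resolution step."""
    out = set()
    for lit in c1:
        comp = (lit[0], not lit[1])
        if comp in c2:
            out.add(frozenset((c1 - {lit}) | (c2 - {comp})))
    return out

def refute(clauses):
    """True iff the empty clause is derivable (set is unsatisfiable)."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolvents(c1, c2):
                if not r:          # derived the empty clause
                    return True
                new.add(r)
        if new <= clauses:         # saturated: nothing new derivable
            return False
        clauses |= new

# {P}, {~P or Q}, {~Q} is unsatisfiable, so refutation succeeds:
cs = [frozenset([("P", True)]),
      frozenset([("P", False), ("Q", True)]),
      frozenset([("Q", False)])]
print(refute(cs))
```

SLD resolution restricts this search (input clauses must be Horn, one
parent is always the current goal), which is what makes Prolog's
depth-first backtracking workable at all; general resolution, as used
above, has no such restriction.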

