On Sep 21, 2004, at 4:55 PM, Ben Goertzel wrote:
> * Similarly, pure logical reasoning systems like NARS are capable of general
> intelligence only when supplied with infeasibly much computing power
I don't think this follows.
I would make this assertion about classical logical reasoning systems, but non-axiomatic systems like NARS are inherently scalable. If the amount of computing power required for NARS is "infeasible", that is a consequence of poor engineering in the implementation, not something mandated by the algorithmic nature of such systems.
Why axiomatic reasoning systems are intractable while non-axiomatic reasoning systems are most certainly tractable is a subtlety in the mathematics that seems to be lost on many people every time the topic comes up -- I've seen this happen many times. Non-axiomatic models allow efficient representation and information coding modes that have no analog in axiomatic models. This point tends to get glossed over, probably because discussions of these things never leave the very high conceptual level.
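To make the "efficient information coding" point concrete, here is a minimal sketch of the truth-value machinery in Pei Wang's NARS (details hedged; the constant `K` and exact formulas follow the standard published description, not anything in this post). The key property is that each belief is summarized by a fixed-size (frequency, confidence) pair derived from evidence counts, so storage per belief stays constant no matter how much evidence has been absorbed -- an axiomatic system has no analogous bounded summary.

```python
K = 1.0  # evidential horizon ("personality parameter" in NARS terms)

def truth(w_plus: float, w: float) -> tuple[float, float]:
    """Map evidence counts (positive, total) to (frequency, confidence)."""
    return w_plus / w, w / (w + K)

def revise(t1: tuple[float, float], t2: tuple[float, float]) -> tuple[float, float]:
    """Pool two independent bodies of evidence for the same statement."""
    (f1, c1), (f2, c2) = t1, t2
    # Recover the evidence weights from (f, c), then add them.
    w1 = K * c1 / (1 - c1)
    w2 = K * c2 / (1 - c2)
    w_plus = f1 * w1 + f2 * w2
    return truth(w_plus, w1 + w2)

a = truth(3, 4)          # 3 positive out of 4 observations
b = truth(1, 2)          # 1 positive out of 2 observations
print(revise(a, b))      # pooled: 4 positive out of 6 -> roughly (0.667, 0.857)
```

Revision is constant-time and the result is another fixed-size pair, which is what lets a NARS-style system keep reasoning under a finite resource budget rather than accumulating an ever-growing proof state.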
> * However, I think that evolutionary programming algorithms and logical
> reasoning systems may both be incorporated as components of Artificial
> General Intelligence systems that can achieve decent levels of AGI with
> feasibly much computing power
Again, I don't see how this follows.
Hacking together multiple representations is inherently non-scalable in computer science: it forces exponential complexity and leaves no room for any universal "friendly exponent" approximation. How this can be considered "computationally feasible" while something like NARS is not does not square with any scalable software design theory I'm familiar with.
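A toy illustration of the cost claim above (this example is mine, not from the post): if each glued-on formalism contributes its own branching factor at every inference step, a depth-d search over the combined space visits the product of the branching factors raised to d, while a single uniform representation pays only one factor raised to d.

```python
from math import prod

def search_states(branching_factors: list[int], depth: int) -> int:
    """States visited by an exhaustive depth-d search when every step must
    branch across all representations simultaneously."""
    return prod(branching_factors) ** depth

print(search_states([3], 10))     # one representation: 59049
print(search_states([3, 3], 10))  # two hacked-together representations: 3486784401
```

Adding a second representation with the same modest branching factor does not double the cost; it squares it, which is the "unfriendly exponent" in a nutshell.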
This isn't really about AGI at all, as these objections would apply to any kind of ordinary scalable systems software engineering. I've often thought that half the problem with AGI wasn't the theory per se but the lack of good knowledge of what theoretically correct design of the abstract concepts should look like. Too much ivory tower, not enough field engineer. :-)
cheers,
j. andrew rogers
