On Dec 29, 2008, at 10:45 AM, Ben Goertzel wrote:
I expanded a previous blog entry of mine on hypercomputation and AGI
into a conference paper on the topic ... here is a rough draft, on
which I'd appreciate commentary from anyone who's knowledgeable on
the subject:
http://goertzel.org/papers/CognitiveInformaticsHypercomputationPaper.pdf
This is a theoretical rather than practical paper, although it does
attempt to explore some of the practical implications as well --
e.g., in the hypothesis that intelligence does require
hypercomputation, how might one go about creating AGI? I come to a
somewhat surprising conclusion, which is that -- even if
intelligence fundamentally requires hypercomputation -- it could
still be possible to create an AI by writing ordinary Turing-computable
programs ... it just wouldn't be possible to do this in a manner
guided entirely by science; one would need to use some other sort of
guidance too, such as chance, imitation or intuition...
As more of a meta-comment, the whole notion of "hypercomputation"
seems muddled, insofar as super-recursive algorithms may be only a
limited example of it.
I was doing a lot of work with inductive Turing machines several years
ago, and most of the differences seemed to be definitional, e.g. what
constitutes an algorithm or an answer. For most practical purposes, the
price of implementing them in conventional discrete space is the
introduction of some (usually acceptable) error. But if they
approximate to the point of functional convergence on a normal Turing
machine... As best I can tell (and I have not really been paying
close attention, because the arguments seem mostly to be people
talking past each other), ITMs raise some interesting philosophical
questions regarding hypercomputation.
We cannot implement a *strict* hypercomputer, but to what extent does
it "count" if we can asymptotically converge on the functional
consequences of a hypercomputer using a normal computer? I suspect
it will be hard to evict the belief in Penrosian magic from the error
bars in any case.
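To make the convergence point concrete, here is a minimal sketch (my own illustration, not anything from Goertzel's paper) of a trial-and-error computation in the spirit of an inductive Turing machine: an ordinary program emits a sequence of guesses about whether a toy "program" halts. The guess sequence stabilizes on the correct answer, but no finite prefix certifies that it has stabilized -- which is exactly the sense in which a normal computer can converge on the functional consequences without ever being a strict hypercomputer. All names here are hypothetical.

```python
def step_bounded_halts(program, budget):
    """Run `program` (a generator factory) for up to `budget` steps;
    return True if it finishes within the budget."""
    it = program()
    for _ in range(budget):
        try:
            next(it)
        except StopIteration:
            return True
    return False

def limiting_guesses(program, budgets):
    """Emit a sequence of guesses about halting. For a halting program
    the sequence eventually stabilizes at 'halts'; for a divergent one
    it stays 'loops' forever. Crucially, the machine never signals that
    convergence has occurred -- that is the (usually acceptable) error."""
    return ["halts" if step_bounded_halts(program, b) else "loops"
            for b in budgets]

def halting():          # a toy program that finishes after 50 steps
    for _ in range(50):
        yield

def looping():          # a toy program that never finishes
    while True:
        yield

print(limiting_guesses(halting, [10, 100, 1000]))   # ['loops', 'halts', 'halts']
print(limiting_guesses(looping, [10, 100, 1000]))   # ['loops', 'loops', 'loops']
```

The gap between this and a "real" hypercomputer is precisely the missing convergence signal: the limit is correct, but no observer inside the system can verify when it has been reached.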
Cheers,
J. Andrew Rogers
-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/