I'm not so sure that humans use uncomputable models in any useful sense
when doing calculus.  Rather, it seems that in practice we use computable
subsets of an in-principle-uncomputable theory...
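
(To make that concrete, here is a toy sketch in Lean 4; the names Expr
and deriv are just illustrative, not from any particular library.
Symbolic differentiation is a computable operation on finite formulas,
even though the functions those formulas denote live over the mostly
uncomputable reals:)

    -- A toy expression language for a fragment of calculus.
    inductive Expr where
      | x                     -- the variable
      | const (c : Int)       -- an integer constant
      | add (e₁ e₂ : Expr)
      | mul (e₁ e₂ : Expr)

    -- Differentiation is pure symbol manipulation: a total, computable
    -- function on expressions, applying the sum and product rules.
    def deriv : Expr → Expr
      | .x         => .const 1
      | .const _   => .const 0
      | .add e₁ e₂ => .add (deriv e₁) (deriv e₂)
      | .mul e₁ e₂ => .add (.mul (deriv e₁) e₂) (.mul e₁ (deriv e₂))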

Oddly enough, one can make statements *about* uncomputability and
uncomputable entities using only computable operations within a
formal system...

For instance, one can prove that, even if x is an uncomputable real number,

x - x = 0

But that doesn't mean one has to be able to hold any *particular*
uncomputable number x in one's brain...
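
(Here is that proof as a minimal sketch in Lean 4, assuming Mathlib's
formalization of the reals.  The derivation is a finite syntactic step;
it never needs any digits of x:)

    import Mathlib.Data.Real.Basic

    -- x ranges over *all* reals, computable or not; sub_self is the
    -- general lemma a - a = 0, applied purely symbolically.
    example (x : ℝ) : x - x = 0 := sub_self x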

Such is the power of abstraction, and I don't see why AGIs can't have it
just like humans do...

Ben

On Fri, Feb 29, 2008 at 4:37 PM, Abram Demski <[EMAIL PROTECTED]> wrote:
> I'm an undergrad who's been lurking here for about a year. It seems to me
> that many people on this list take Solomonoff Induction to be the ideal
> learning technique (for unrestricted computational resources). I'm wondering
> what justification there is for the restriction to Turing-machine models of
> the universe that Solomonoff Induction uses. Restricting an AI to computable
> models will obviously make it more realistically manageable. However,
> Solomonoff Induction needs infinite computational resources, so this clearly
> isn't a justification.
>
> My concern is that humans make models of the world that are not computable;
> in particular, I'm thinking of the way physicists use differential
> equations. Even if physics itself is computable, the fact that humans use
> incomputable models of it remains. Solomonoff Induction itself is an
> incomputable model of intelligence, so an AI that used Solomonoff Induction
> (even if we could get the infinite computational resources needed) could
> never understand its own learning algorithm. This is an odd position for a
> supposedly universal model of intelligence IMHO.
>
> My thinking is that a more-universal theoretical prior would be a prior over
> logically definable models, some of which will be incomputable.
>
> Any thoughts?
>
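
For reference, the prior in question, in one standard formulation,
assigns each finite string x the weight

    M(x) = \sum_{p \,:\, U(p) = x*} 2^{-|p|}

where U is a universal monotone Turing machine and the sum ranges over
the minimal programs p whose output begins with x.  The hypotheses are
exactly the programs for U; that is the restriction to Turing-machine
models being questioned above.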



-- 
Ben Goertzel, PhD
CEO, Novamente LLC and Biomind LLC
Director of Research, SIAI
[EMAIL PROTECTED]

"If men cease to believe that they will one day become gods then they
will surely become worms."
-- Henry Miller
