Re: [agi]

2008-03-13 Thread Eric B. Ramsay
So Ben, based on what you are saying, you fully expect them to fail their Turing test? Eric B. Ramsay Ben Goertzel [EMAIL PROTECTED] wrote: I know Selmer and his group pretty well... It is well done stuff, but it is purely hard-coded-knowledge-based logical inference -- there is no real

[agi] Causality challenge

2008-03-07 Thread Eric B. Ramsay
Are any of the AI folks here competing in this challenge? http://www.causality.inf.ethz.ch/challenge.php Eric B. Ramsay

Re: [agi] Circular definitions of intelligence

2007-04-26 Thread Eric B. Ramsay
Several emails ago, both Ben and Richard said they were no longer going to continue this argument, yet here they are - still arguing. Will the definition of intelligence be able to accommodate this behavior by these gentlemen? Benjamin Goertzel [EMAIL PROTECTED] wrote: - When you try

RE: [agi] How should an AGI ponder about mathematics

2007-04-24 Thread Eric B. Ramsay
The more problematic issue is what happens if you non-destructively upload your mind. What do you do with the original, which still considers itself you? The upload also considers itself you and may suggest a bullet. Matt Mahoney [EMAIL PROTECTED] wrote: --- John G. Rose wrote: A baby AGI

Re: [agi] How should an AGI ponder about mathematics

2007-04-24 Thread Eric B. Ramsay
Your twin example is not a good choice. The upload will consider itself to have a claim on the contents of your life - your financial resources, for example. Eugen Leitl [EMAIL PROTECTED] wrote: On Tue, Apr 24, 2007 at 07:09:22AM -0700, Eric B. Ramsay wrote: The more problematic issue is what

Re: [tt] [agi] Definition of 'Singularity' and 'Mind'

2007-04-18 Thread Eric B. Ramsay
Actually Richard, these are the things you imagine you would like to do given your current level of intelligence. I suspect very much that the moment you became superintelligent there would be a paradigm change in what you consider fun. Eric Richard Loosemore [EMAIL PROTECTED] wrote: Eugen

[agi] Low I.Q. AGI

2007-04-15 Thread Eric B. Ramsay
There is an easy assumption among most writers on this board that once the AGI exists, its route to becoming a singularity is a sure thing. Why is that? In humans there is a wide range of smartness in the population. People face intellectual thresholds that they cannot cross because they just do

Re: [agi] A Course on Foundations of Theoretical Psychology...

2007-04-13 Thread Eric B. Ramsay
I would certainly be interested. Ask Ben if you can use the Novamente pavilion in Second Life and conduct the workshop there (or maybe the IBM pavilion, which is actually better set up). More people could attend that way, and it would keep costs down. Eric B. Ramsay Richard Loosemore [EMAIL