EGHeflin said:
The reason is that the approach is essentially 'Asimovian' in nature and, therefore, wouldn't result in anything more than perhaps a servile pet, call it iRobot, which is always 'less-than-equal' to you and therefore always short of your goal to achieve the so called
Consider an analogy. In human culture, there is a rigid distinction between man and woman. This makes sense, because there are very few intermediate cases. True hermaphroditism is about one in a million; and ambiguous genitalia are seen in maybe one in 10,000-100,000. On the other
Kevin Copple wrote:
It seems clear that AGI will be obtained in the foreseeable future. It also seems that it will be done with adequate safeguards against a runaway entity that would exterminate us humans. Likely it will remain under our control also.
HOWEVER, this brings up another
Ben Goertzel wrote:
Since I'm too busy studying neuroscience, I simply don't have any time for learning operating systems. I will therefore either use the systems I know or the systems that require the least amount of effort to learn, regardless of their features.
Alan, that sounds like a
ULTIMATE KNOWLEDGE
Our AGI will come to know everything. Every single flap of every butterfly wing in all of history. If it has emotions like ours, it may become rather depressed and realize that it is all pointless. Maybe we will understand and agree with the AGI's explanation. What
Ben Goertzel wrote:
Since I'm too busy studying neuroscience, I simply don't have any time for learning operating systems. I will therefore either use the systems I know or the systems that require the least amount of effort to learn, regardless of their features.
Alan, that sounds
I say this as someone who just burned half a week setting up a Linux
network in his study.
Ditto...
The Windows 3.11 machine took 10 minutes.
The Leenooks machine took 3 days...
Yeah, that stuff is a pain. But compared to designing, programming and testing a thinking machine, it's cake.
On Fri, 2003-01-10 at 16:44, Damien Sullivan wrote:
While I'm equally horrified by the idea of someone using DOS as a benchmark, there is a difference between 'stump' as in "I can't figure this out" and 'stump' as in "I haven't learned much about this."
Aye, I think the reaction is more to an apparent
Eliezer wrote:
James Rogers wrote:
Your intuition is correct, depending on how strict you are about knowledge. The intrinsic algorithmic information content of any machine is greater (sometimes much greater) than the algorithmic information content of its static state. The intrinsic
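The distinction between a machine and its static state can be made concrete with compression: compressed length is a computable upper bound on algorithmic (Kolmogorov) information, which is itself uncomputable. A minimal Python sketch, using zlib as a crude proxy; the program/state pair below is a made-up illustration, not anything from the thread:

```python
import zlib

def compressed_size(data: bytes) -> int:
    """Compressed length: a computable upper-bound proxy for
    algorithmic information content (true Kolmogorov complexity
    is uncomputable)."""
    return len(zlib.compress(data, 9))

# A static state can be huge in raw size yet carry very little
# algorithmic information: a short generating program pins it
# down almost completely.
state = b"ab" * 10_000              # 20,000-byte static state
program = b"print('ab' * 10_000)"   # tiny machine that produces it

print(len(state))                   # raw size of the state
print(compressed_size(state))       # much smaller: the state is mostly redundancy
print(compressed_size(program))     # the generator's description is tiny
```

Note that zlib only upper-bounds the true quantity; the claim in the thread, that the machine (program plus dynamics) carries at least as much algorithmic information as its static state, concerns the uncomputable ideal, not this proxy.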
Friday, January 10, 2003, 10:36:35 PM, Kevin Copple wrote:
KC Well, my "The Next Wave" post was intended to be humorous. I'm not that much
KC of a comedian, so I may have weighed in too heavily on the apparently serious side.
KC Let me apologize to the extent it was a feebly frivolous failure.
The line
Kevin Copple wrote:
Perhaps I am wrong, but my impression is that the talk here about AGI sense of self, AGI friendliness, and so on is quite premature.
Attitudes on that vary, I think...
I know that many AGI researchers agree with you, and think such issues are
best deferred till after some