Matt Mahoney wrote:
--- Richard Loosemore [EMAIL PROTECTED] wrote:
Matt Mahoney wrote:
Suppose that the collective memories of all the humans make up only one
billionth of your total memory, like one second of memory out of your human
lifetime. Would it make much difference if it was erased
--- Richard Loosemore [EMAIL PROTECTED] wrote:
My assumption is friendly AI under the CEV model. Currently, FAI is
unsolved.
CEV only defines the problem of friendliness, not a solution. As I
understand it, CEV defines AI as friendly if on average it gives humans
what they want in the
I'd be interested in reading a thoughtful point of view on the topics
below, written for someone with a decent grasp of philosophy and science
(like myself) but without subject matter expertise. I'd be less
interested in an anthology of what others say and more interested in
reading the
Matt Mahoney wrote:
--- Richard Loosemore [EMAIL PROTECTED] wrote:
Why do you say that "Our reign will end in a few decades" when, in fact, one
of the most obvious things that would happen in this future is that
humans will be able to *choose* what intelligence level to be
experiencing, on a day to day basis?
Richard, I have no doubt that the technological wonders you mention will all
be possible after a singularity. My question is about what role humans will
play in this. For the last 100,000 years, humans have been the most
intelligent creatures on Earth. Our reign will end in a few decades.
Who
of consciousness, and understand its
progenitors better than we know ourselves. It might just have reverence for
the father of all machines.
Andrew Yost
From: candice schuster [mailto:[EMAIL PROTECTED]] Sent: Tuesday, October 23,
2007 1:27 PM To: [EMAIL PROTECTED] Subject: RE: [singularity] QUESTION
Richard
Hi Richard,
Without getting too technical on you...how do you propose implementing these
ideas of yours?
Candice
Date: Tue, 23 Oct 2007 20:28:42 -0400 From: [EMAIL PROTECTED] To:
singularity@v2.listbox.com Subject: Bright Green Tomorrow [WAS Re:
[singularity] QUESTION]
candice schuster
This is a perfect example of how one person comes up with some positive,
constructive ideas and then someone else waltzes right in, pays
no attention to the actual arguments, pays no attention to the relative
probability of different outcomes, but just sneers at the whole idea
with
candice schuster wrote:
Hi Richard,
Without getting too technical on you...how do you propose implementing
these ideas of yours?
In what sense?
The point is that implementation would be done by the AGIs, after we
produce a blueprint for what we want.
Richard Loosemore
[EMAIL PROTECTED] wrote:
Hello Richard,
If it's not too lengthy and unwieldy to answer, or give a general sense
as to why yourself and various researchers think so...
Why is it that in the same e-mail you can make the statement so
confidently that ego or sense of selfhood is not something
Matt Mahoney wrote:
--- Richard Loosemore [EMAIL PROTECTED] wrote:
This is nonsense: the result of giving way to science fiction fantasies
instead of thinking through the ACTUAL course of events. If the first
one is benign, the scenario below will be impossible, and if the first
one is not
moving at a fast pace, what I do not understand is the benefit?
Perhaps you and AI live in the Science Fiction world and it's not the other way
round??
Candice
Date: Tue, 23 Oct 2007 15:14:05 -0400 From: [EMAIL PROTECTED] To:
singularity@v2.listbox.com Subject: Re: [singularity] QUESTION
Subject: RE: [singularity] QUESTION
Richard,
Thank you for your response. I have read your other posts and
understand what 'the story' is, so to speak. I understand where you are
coming from, and when I talk about evolution theories this is not to
throw a 'stick in the wheel', so to speak; it is to
--- albert medina [EMAIL PROTECTED] wrote:
All sentient creatures have a sense of self, about which all else
revolves. Call it egocentric singularity or selfhood or identity.
The most evolved ego that we can perceive is in the human species. As far
as I know, we are the only beings in
albert medina wrote:
Dear Sirs,
I have a question to ask and I am not sure that I am sending it to the
right email address. Please correct me if I have made a mistake. From
the outset, please forgive my ignorance of this fascinating topic.
All sentient creatures have a sense of self,
A. Yost
-----Original Message-----
From: Richard Loosemore [mailto:[EMAIL PROTECTED]
Sent: Monday, October 22, 2007 11:15 AM
To: singularity@v2.listbox.com
Subject: Re: [singularity] QUESTION
albert medina wrote:
Dear Sirs,
I have a question to ask and I am not sure that I am sending
What is the matter with people?
Perhaps we are these robots in the first place...ever thought of that?
Subject: RE: [singularity] QUESTION
Date: Mon, 22 Oct 2007 11:59:51 -0700
From: [EMAIL PROTECTED]
To: singularity@v2.listbox.com
...but the singularity advanced