Matt,
Thank you for your reply. For me it is very thought-provoking.
-----Original Message-----
From: Matt Mahoney [mailto:[EMAIL PROTECTED]]
Sent: Thursday, June 12, 2008 7:23 PM
To: agi@v2.listbox.com
Subject: RE: [agi] IBM, Los Alamos scientists claim fastest computer
--- On Thu, 6/12/08,
There've been enough responses to this that I will reply in generalities, and
hope I cover everything important...
When I described Nirvana attractors as a problem for AGI, I meant it in
the sense that they form a substantial challenge for the designer (as do many
other
Most people are about as happy as they make up their minds to be.
-- Abraham Lincoln
In our society, after a certain point where we've taken care of our
immediate needs, arguably we humans are and should be subject to the Nirvana
effect.
Deciding that you can settle for something (if your
In my visualization of the Cosmic All, it is not surprising.
However, there is an undercurrent of the Singularity/AGI community that is
somewhat apocalyptic in tone, and which (to my mind) seems to imply or assume
that somebody will discover a Good Trick for self-improving AIs and the jig
will
--- On Fri, 6/13/08, Ed Porter [EMAIL PROTECTED] wrote:
[Ed Porter] -- Why couldn't each of the 10^6 fibers
have multiple connections along its length within the cm^3 (although it
could be represented as one row in the matrix, with individual
connections represented as elements in such a row)
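Ed Porter's suggestion (one row per fiber, with each connection along the fiber's length stored as an element in that row) can be sketched as a sparse matrix built from adjacency rows. This is a minimal illustration, not from the thread itself; the names `SparseFiberMatrix` and `add_connection` are hypothetical.

```python
from collections import defaultdict

class SparseFiberMatrix:
    """Rows = fibers; each row holds that fiber's connections
    (target site -> weight) within the volume, so one fiber can
    have many connections while occupying a single row."""

    def __init__(self):
        # fiber_id -> {target_id: weight}; absent entries are zero
        self.rows = defaultdict(dict)

    def add_connection(self, fiber_id, target_id, weight=1.0):
        self.rows[fiber_id][target_id] = weight

    def connections(self, fiber_id):
        return self.rows[fiber_id]

m = SparseFiberMatrix()
# One fiber making several connections along its length:
for target in (17, 42, 99):
    m.add_connection(fiber_id=0, target_id=target, weight=0.5)

print(len(m.connections(0)))  # 3 connections stored in fiber 0's row
```

Because only nonzero elements are stored, 10^6 such rows remain tractable as long as each fiber's connection count stays modest.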
I think that our culture of self-indulgence is to some extent in a Nirvana
attractor. If you think that's a good thing, why shouldn't we
No, I think it's a bad thing. That's why I said "This is why pleasure
and lack of pain suck as goals."
On Fri, Jun 13, 2008 at 1:28 PM, J Storrs Hall, PhD [EMAIL PROTECTED] wrote:
I think that our culture of self-indulgence is to some extent in a Nirvana
attractor. If you think that's a good thing, why shouldn't we all lie around
with wires in our pleasure centers (or hopped up on cocaine, same
Mark,
Assuming that
a) pain avoidance and pleasure seeking are our primary driving forces; and
b) our intelligence wins over our stupidity; and
c) we don't get killed by something we cannot control;
Nirvana is where we go.
Jiri
Yes, but I strongly disagree with assumption one. Pain avoidance and
pleasure are best viewed as status indicators, not goals.
- Original Message -
From: Jiri Jelinek [EMAIL PROTECTED]
To: agi@v2.listbox.com
Sent: Friday, June 13, 2008 3:42 PM
Subject: Re: [agi] Nirvana
Mark,
a future AGI will probably believe one thing, but act as if it believes
something quite different, for very logical reasons.
I wish/hope we can avoid that. AGI should IMO follow scientific
principles. If its honesty hurts then our society/environment should
be targeted for a change.
Regards,
On Friday 13 June 2008 02:42:10 pm, Steve Richfield wrote:
Buddhism teaches that happiness comes from within, so stop twisting the
world around to make yourself happy, because this can't succeed. However, it
also teaches that all life is sacred, so pay attention to staying healthy.
In short,
a) pain avoidance and pleasure seeking are our primary driving forces;
On Fri, Jun 13, 2008 at 3:47 PM, Mark Waser [EMAIL PROTECTED] wrote:
Yes, but I strongly disagree with assumption one. Pain avoidance and
pleasure are best viewed as status indicators, not goals.
Pain and pleasure [levels]
Buddhism teaches that happiness comes from within, so stop twisting the
world around to make yourself happy, because this can't succeed.
Which is of course false... It might come from within, but triggers can be
internal as well as external, and both work pretty well. As for the world
twisting, it's just
Your belief value is irrelevant to reality.
Of course all human activity is associated with pain and pleasure because
evolution gave us pleasure and pain to motivate us to do smart things (as
far as evolution is concerned) and avoid stupid things (and yes, I am
anthropomorphizing evolution
On Fri, Jun 13, 2008 at 6:21 PM, Mark Waser [EMAIL PROTECTED] wrote:
if you wire-head, you go extinct
Doing it today certainly wouldn't be a good idea, but whatever we do
to take care of risks and improvements, our AGI(s) will eventually do
a better job, so why not then?
Regards,
Jiri Jelinek