Matt,
Printing "ahh" or "ouch" is just for show. The important observation is that
the program changes its behavior in response to a reinforcement signal in the
same way that animals do.
Let me remind you that the problem we were originally discussing was
about qualia and uploading. Not just about a
-----Original Message-----
From: Matt Mahoney [mailto:[EMAIL PROTECTED]]
Sent: Sunday, November 18, 2007 5:32 PM
To: agi@v2.listbox.com
Subject: Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)
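The claim above is concrete enough to sketch in code. What follows is a
minimal C++ illustration of the kind of reinforcement loop being described,
assuming a program that learns a 2-input logic function from reward/punish
signals. It is NOT the actual autobliss.cpp, whose source is not quoted in
this thread; the AND-shaped training target and the learning rate are
assumptions made purely for the example.

// Minimal sketch of a reinforcement learner of the kind described above.
// Not the real autobliss.cpp; target function and constants are assumed.
#include <cstdio>
#include <cstdlib>
#include <ctime>

int main() {
    double p[4] = {0.5, 0.5, 0.5, 0.5};  // P(answer 1) for each 2-bit input
    const int target[4] = {0, 0, 0, 1};  // trainer rewards AND-like answers
    std::srand((unsigned)std::time(0));
    for (int step = 0; step < 10000; ++step) {
        int in = std::rand() % 4;                            // random input pair
        int out = (std::rand() / (RAND_MAX + 1.0)) < p[in];  // stochastic answer
        if (out == target[in]) {
            std::printf("ahh\n");         // reward signal
            p[in] += out ? 0.05 : -0.05;  // reinforce the answer just given
        } else {
            std::printf("ouch\n");        // punishment signal
            p[in] += out ? -0.05 : 0.05;  // discourage the answer just given
        }
        if (p[in] < 0.01) p[in] = 0.01;   // keep probabilities usable
        if (p[in] > 0.99) p[in] = 0.99;
    }
    return 0;
}

The behavior change Matt points to is exactly the drift of p[] toward the
rewarded answers; the printed strings play no role in it.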
--- Gary Miller [EMAIL PROTECTED] wrote:
To complicate things further:
A small percentage of humans perceive pain as pleasure
and prefer it, at least in a sexual context, or else
fetishes like sadomasochism would not exist.
And they do in fact experience pain as a greater pleasure.
Matt,
Your algorithm is too complex.
What's the point of doing step 1?
Step 2 is sufficient.
Saturday, November 3, 2007, 8:01:45 PM, you wrote:
So we can dispense with the complex steps of making a detailed copy of your
brain and then have it transition into a degenerate state, and just skip
--- Richard Loosemore [EMAIL PROTECTED] wrote:
Matt Mahoney wrote:
--- Jiri Jelinek [EMAIL PROTECTED] wrote:
On Nov 11, 2007 5:39 PM, Matt Mahoney [EMAIL PROTECTED] wrote:
We just need to control the AGI's goal system.
Matt,
autobliss passes tests for awareness of its inputs and responds as if it has
qualia. How is it fundamentally different from human awareness of pain and
pleasure, or is it just a matter of degree?
If your code has feelings it reports, then reversing the order of the
feeling strings (without
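If the truncated condition is, as it appears, "without changing anything
else," the reversal test is easy to state against the sketch given earlier
in the thread (again an assumption about how such a program is written, not
a quote of autobliss.cpp): swap the two report strings and leave the update
rule alone. The reward branch of that sketch would become:

// Same reward/punish branch as the earlier sketch, with only the report
// strings swapped; the update rule is untouched, so the learned behavior
// is identical while the reported "feeling" is reversed.
if (out == target[in]) {
    std::printf("ouch\n");        // swapped label; still the reward case
    p[in] += out ? 0.05 : -0.05;  // identical update: answer reinforced
} else {
    std::printf("ahh\n");         // swapped label; still the punishment case
    p[in] += out ? -0.05 : 0.05;  // identical update: answer discouraged
}

The strings carry no information about what the code does, which is the
point of the test.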
On Nov 11, 2007 5:39 PM, Matt Mahoney [EMAIL PROTECTED] wrote:
We just need to control the AGI's goal system.
You can only control the goal system of the first iteration.
..and you can add rules for its creations (e.g. stick with the same
goals/rules unless authorized otherwise)
But if
I've often heard people say things like "qualia are an illusion" or
"consciousness is just an illusion," but the concept of an illusion
when applied to the mind is not very helpful, since all our thoughts
and perceptions could be considered illusions reconstructed from
limited sensory data and
Matt,
We can compute behavior, but nothing indicates we can compute
feelings. Qualia research is needed to figure out new platforms for
uploading.
Regards,
Jiri Jelinek
-----Original Message-----
From: Jiri Jelinek [mailto:[EMAIL PROTECTED]]
Sent: Sunday, November 04, 2007 2:59 AM
To: agi@v2.listbox.com
Subject: Re: [agi] Nirvana? Manyana? Never!

Ed,
But I guess I am too much of a product of my upbringing and education
to want only bliss. I like to create things and ideas.
I assume it's because it provides pleasure you are unable to get in
other ways. But there are other ways, and if those were easier for you,
you would prefer them over
--- Jiri Jelinek [EMAIL PROTECTED] wrote:
Matt,
Create a numeric pleasure variable in your mind, initialize it with
a positive number and then keep doubling it for some time. Done? How
do you feel? Not a big difference? Oh, keep doubling! ;-))
The point of autobliss.cpp is to illustrate
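Taken literally, Jiri's thought experiment above is a few lines of C++. A
hedged sketch follows; the variable name and loop count are just for
illustration:

// The doubling experiment taken literally. A double overflows to infinity
// after roughly 1024 doublings; nothing here plausibly feels anything,
// however large the number gets.
#include <cstdio>

int main() {
    double pleasure = 1.0;         // "initialize it with a positive number"
    for (int i = 0; i < 1100; ++i)
        pleasure *= 2.0;           // "keep doubling it for some time"
    std::printf("pleasure = %g\n", pleasure);  // prints "pleasure = inf"
    return 0;
}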
On 11/4/07, Matt Mahoney [EMAIL PROTECTED] wrote:
Let's say your goal is to stimulate your nucleus accumbens. (Everyone has
this goal; they just don't know it). The problem is that you would forgo
food, water, and sleep until you died (we assume, from animal experiments).
We have no need to
On Nov 3, 2007 12:58 PM, Mike Dougherty [EMAIL PROTECTED] wrote:
You are describing a very convoluted process of drug addiction.
The difference is that I have safety controls built into that scenario.
If I can get you hooked on heroin or crack cocaine, I'm pretty confident
that you will
--- Edward W. Porter [EMAIL PROTECTED] wrote:
If bliss without intelligence is the goal of the machines you imagine
running the world, for the cost of supporting one human they could
probably keep at least 100 mice in equal bliss, so if they were driven to
maximize bliss, why wouldn't they kill
On 11/2/07, Eliezer S. Yudkowsky wrote:
I didn't ask whether it's possible. I'm quite aware that it's
possible. I'm asking if this is what you want for yourself. Not what
you think that you ought to logically want, but what you really want.
Is this what you lived for? Is this the most
Jiri Jelinek wrote:
Ok, seriously, what's the best possible future for mankind you can imagine?
In other words, where do we want our cool AGIs to get us? I mean
ultimately. What is it at the end as far as you can see?
That's a very personal question, don't you think?
Even the parts I'm
On Fri, Nov 02, 2007 at 12:41:16PM -0400, Jiri Jelinek wrote:
On Nov 2, 2007 2:14 AM, Eliezer S. Yudkowsky [EMAIL PROTECTED] wrote:
if you could have anything you wanted, is this the end you
would wish for yourself, more than anything else?
Yes. But don't forget I would also have AGI
On Fri, Nov 02, 2007 at 01:19:19AM -0400, Jiri Jelinek wrote:
Or do we know anything better?
I sure do. But ask me again, when I'm smarter, and have had more time to
think about the question.
--linas
On Nov 2, 2007 2:14 AM, Eliezer S. Yudkowsky [EMAIL PROTECTED] wrote:
I'm asking if this is what you want for yourself.
Then you could read just the first word from my previous response: YES
if you could have anything you wanted, is this the end you
would wish for yourself, more than anything
Jiri,
You turn it into a tautology by mistaking 'goals' in general for
'feelings'. Feelings form one, somewhat significant at this point,
part of our goal system. But the intelligent part of the goal system
is a much more 'complex' thing and can also act as a goal in itself.
You can say that AGIs will be
Linas, BillK,
It might currently be hard for association-based human minds to
accept, but things like roses, power-over-others, being worshiped
or loved are just a waste of time with indirect feeling triggers
(assuming the nearly-unlimited ability to optimize).
Regards,
Jiri Jelinek
On Nov 2, 2007
On Nov 2, 2007 2:35 PM, Vladimir Nesov [EMAIL PROTECTED] wrote:
Could you please provide one specific example of a human goal which
isn't feeling-based?
It depends on what you mean by 'based' and 'goal'. Does any choice
qualify as a goal? For example, if I choose to write a certain word in
Is this really what you *want*?
Out of all the infinite possibilities, this is the world in which you
would most want to live?
Yes, great feelings only (for as many people as possible) and the
engine being continuously improved by AGI which would also take care
of all related tasks including
ED So is the envisioned world one in which people are on something
equivalent to a perpetual heroin or crystal meth rush?
Kind of, except it would be safe.
If so, since most current humans wouldn't have much use for such people, I
don't know why self-respecting productive human-level AGIs
Jiri Jelinek wrote:
Let's go to an extreme: Imagine being an immortal idiot. No matter
what you do or how hard you try, the others will always be so much
better at everything that you will eventually become totally
discouraged, or even afraid to touch anything, because it would just
always
Stefan,
closing your eyes to reality. This is bad because you
effectively deny yourself the potential for further increasing your fitness
I'm closing my eyes, but my AGI - which is an extension of my
intelligence (/me) - does not. In fact it opens them more than I could.
We and our AGI should