RE: The moral dimension of simulation

2006-08-13 Thread Stathis Papaioannou

David Nyman writes:

> > They're not just simulating us, are they? They might have just slapped
> > together a virtual universe in an idle moment to see how it turns out. Maybe
> > they're more interested in star formation, or bacteria or something. Is an
> > E. coli in your gut justified in thinking God made the universe, including
> > human guts, just for its benefit?
> 
> Stathis
> 
> I see what you mean, of course. However, it's not really what I was
> trying to elicit by my original post. If I were to try to justify my
> actions to you in the sort of way you describe above, I don't think
> you'd be very accepting of this, nor would much of the rest of society.
> I don't mean to say that there isn't a great deal of hypocrisy and
> deviation from ethical conduct in the real world, but unless one is
> prepared to discard the project of working together to make things
> better rather than worse, I believe that we should take ethical
> dialogue seriously. My sense is that much more advanced civilisations
> would have developed in this area too, not just technologically - for
> one thing, they have presumably found ways to live in harmony and not
> self-destruct.  So at the least these issues would have meaning for
> them.
> 
> That's why I feel that your dismissal of the issues isn't very
> illuminating. BTW, I don't intend this as a complaint, I'm just
> clarifying what I had in mind in my original questions - that it would
> be interesting to explore the ethical dimensions of possible simulators
> and their simulations. I think you're saying that we can't know and
> shouldn't care, which I don't find very interesting.
> 
> As a challenge to your view, might I suggest that in your example re
> the E. coli - if we knew that the E. coli was conscious and had
> feelings, we might be more concerned about it. Do you think it's a
> reasonable assumption that technologists capable enough to include us
> in their simulation, regardless of their 'ultimate purpose', would a)
> not know we had consciousness and feelings, or b) not care, and if so,
> on what justification? Or is this simply unfathomable? I'm not asking
> rhetorically, I'm really interested.
> 
> David

I am proposing, entirely seriously, that it would only be by incredible luck 
that entities vastly superior to us, or even just vastly different, would 
be able to empathise with us on any level, or share our ethical standards 
or any of our other cultural attributes. Do you *really* think that if we 
somehow discovered E. coli had a rudimentary consciousness we would 
behave any differently towards them? Or even if we discovered that each 
E. coli contained an entire universe full of zillions of intelligent beings, 
so that we commit genocide on a scale beyond comprehension every time 
we boil water? I think we would all just say "too bad, we have to look after 
ourselves."

Sentience and intelligence are *not* synonymous with what we think of as 
ethical behaviour. There is no logical contradiction in an intelligent being 
who, for example, does not feel physical or psychological pain, or who does but 
devotes his life to wiping out every other sentient being which might compete 
with him. Perhaps we could expect that if we were in a simulation our 
creators might empathise with us if we had been deliberately made in their 
image, but there is no evidence that this is so, any more than there is 
evidence for an omnipotent, personal God. The universe was set in motion 
with the fundamental laws of physics (or whatever), and humans are just a 
tiny and insignificant part of the result, here and gone in an eyeblink. There 
isn't even any evidence that human-level intelligence is an especially useful 
evolutionary strategy.

Stathis Papaioannou
_
Be one of the first to try Windows Live Mail.
http://ideas.live.com/programpage.aspx?versionId=5d21c51a-b161-4314-9b0e-4911fb2b2e6d
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/everything-list
-~--~~~~--~~--~--~---



Re: The moral dimension of simulation

2006-08-10 Thread jamikes


- Original Message -
From: "David Nyman" <[EMAIL PROTECTED]>
To: "Everything List" 
Sent: Wednesday, August 09, 2006 12:10 PM
Subject: Re: The moral dimension of simulation


>
> <[EMAIL PROTECTED]> wrote:
>
> > I think "we simulate what we are living in" according to the little we
> > know. Such 'simulation' - 'simplification' - 'modeling' - 'metaphorizing' -
> > or even 'harrypotterizing' things we think does not change the
> > "unknown/unknowable" we live in. We just think "and therefore we think
> > we are."
>
> John
>
David wrote:
> I'm encouraged by the above to ask if you have any views deriving from
> this vis-a-vis the 'first person prime' thread?
>
> David

The thread was much, much more than I could attentively follow. My vocabulary
is different from most posts, and so the 'first person prime' is hard to
comprehend.
My views do not derive from that thread.  'My' 1st person views are derived
from 'impacts' (I will accept a better word) I get - interpreted (adjusted?)
according to my 'mindcontent' - experience, knowledge-base, personality -
which means that it is by no means primary. My percept of reality is a
composite of them all.
Yours is different. If you tell me about yours, I will 'catch' them in the
form my 1st person(ality) understands them, not as you thought them.
I wonder if I "caught" your question?

John M
>





RE: The moral dimension of simulation

2006-08-09 Thread Nick Prince

We kill cows to eat them. I feel sure they are conscious beings, yet the
killing goes on.  Inhabitants of simulations, or of this world, seem to opt
for what suits them.  However, slavery was abolished, and maybe animal rights
will change as we humans get more thoughtful and considerate.  Like I said,
I hope simulators are moral.  If the simulators save the stages, though,
they can always resurrect the suffering souls to correct their moral
mistakes.  As Bostrom says, you could always return to some datum point and
rerun the simulation to get rid of or erase a part which has a lot of
suffering in it.  Unfortunately, like entropy, the sum total of suffering in
world + simulation always increases.
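Nick's save-and-rerun idea is essentially checkpointing with rollback. A minimal sketch, where the Simulation class, its dynamics, and the suffering bound are all invented for illustration:

```python
import copy

class Simulation:
    """Toy stand-in for a simulated world: its state is just a dict."""
    def __init__(self):
        self.state = {"tick": 0, "suffering": 0}

    def step(self):
        # Invented dynamics: suffering spikes every fourth tick.
        self.state["tick"] += 1
        if self.state["tick"] % 4 == 0:
            self.state["suffering"] += 3

def run_with_rollback(sim, ticks, max_suffering):
    """Checkpoint the state each tick; if total suffering ever exceeds
    the bound, return to the last acceptable datum point and halt."""
    checkpoints = [copy.deepcopy(sim.state)]
    for _ in range(ticks):
        sim.step()
        if sim.state["suffering"] > max_suffering:
            sim.state = copy.deepcopy(checkpoints[-1])  # erase the bad branch
            break
        checkpoints.append(copy.deepcopy(sim.state))
    return sim.state
```

Note that rolling back erases the bad branch from the simulation's future, but the suffering already computed has still "happened" - which is the entropy-like point above.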

Nick Prince


-Original Message-
From: everything-list@googlegroups.com
[mailto:[EMAIL PROTECTED] On Behalf Of David Nyman
Sent: 09 August 2006 14:49
To: Everything List
Subject: Re: The moral dimension of simulation


Stathis Papaioannou wrote:

> They're not just simulating us, are they? They might have just slapped
> together a virtual universe in an idle moment to see how it turns out. Maybe
> they're more interested in star formation, or bacteria or something. Is an
> E. coli in your gut justified in thinking God made the universe, including
> human guts, just for its benefit?

Stathis

I see what you mean, of course. However, it's not really what I was
trying to elicit by my original post. If I were to try to justify my
actions to you in the sort of way you describe above, I don't think
you'd be very accepting of this, nor would much of the rest of society.
I don't mean to say that there isn't a great deal of hypocrisy and
deviation from ethical conduct in the real world, but unless one is
prepared to discard the project of working together to make things
better rather than worse, I believe that we should take ethical
dialogue seriously. My sense is that much more advanced civilisations
would have developed in this area too, not just technologically - for
one thing, they have presumably found ways to live in harmony and not
self-destruct.  So at the least these issues would have meaning for
them.

That's why I feel that your dismissal of the issues isn't very
illuminating. BTW, I don't intend this as a complaint, I'm just
clarifying what I had in mind in my original questions - that it would
be interesting to explore the ethical dimensions of possible simulators
and their simulations. I think you're saying that we can't know and
shouldn't care, which I don't find very interesting.

As a challenge to your view, might I suggest that in your example re
the E. coli - if we knew that the E. coli was conscious and had
feelings, we might be more concerned about it. Do you think it's a
reasonable assumption that technologists capable enough to include us
in their simulation, regardless of their 'ultimate purpose', would a)
not know we had consciousness and feelings, or b) not care, and if so,
on what justification? Or is this simply unfathomable? I'm not asking
rhetorically, I'm really interested.

David

> Brent Meeker writes:
>
> > David Nyman wrote:
> > > Stathis Papaioannou wrote:
> > >
> > >
> > >>Perhaps it says something about the nature of the simulation's
> > >>creators, but I don't see that it says anything about the probability
> > >>that we are living in one.
> > >
> > >
> > > Do you mean that if we are living in one, then the moral standards of
> > > its creators are reprehensible (to our way of thinking) or at least
> > > opaque?
> >
> > But the hypothesis that the creators are like us is part of the
> > justification for supposing they would run simulations of intelligent
> > beings.  If you then argue that their motivations and ethics might be
> > alien to us, you've discarded any reason for supposing they would simulate us.
>
> They're not just simulating us, are they? They might have just slapped
> together a virtual universe in an idle moment to see how it turns out.
> Maybe they're more interested in star formation, or bacteria or something.
> Is an E. coli in your gut justified in thinking God made the universe,
> including human guts, just for its benefit?
>
> Stathis Papaioannou













Re: The moral dimension of simulation

2006-08-09 Thread David Nyman

<[EMAIL PROTECTED]> wrote:

> I think "we simulate what we are living in" according to the little we know.
> Such 'simulation' - 'simplification' - 'modeling' - 'metaphorizing' - or
> even 'harrypotterizing' things we think does not change the "unknown/unknowable"
> we live in. We just think "and therefore we think we are."

John

I'm encouraged by the above to ask if you have any views deriving from
this vis-a-vis the 'first person prime' thread?

David

> Nick (and List):
> just a short remark to the very first words of your post below (mostly
> erased):
> >
> > If we are living in a simulation .<
> I think this is the usual pretension (not only on this list).
> I think "we simulate what we are living in" according to the little we know.
> Such 'simulation' - 'simplification' - 'modeling' - 'metaphorizing' - or
> even 'harrypotterizing' things we think does not change the "unknown/unknowable"
> we live in. We just think "and therefore we think we are."
>
> Most ignorantly and commonsensically yours
>
> John M
>
>
> - Original Message -
> From: "Nick Prince" <[EMAIL PROTECTED]>
> To: 
> Sent: Tuesday, August 08, 2006 5:05 PM
> Subject: RE: The moral dimension of simulation
>
>
> > If we are living in a simulation (and I believe the matrix hypothesis is a
> > real possibility) and if we are all just software constructs then the
> > architect has some options available to it.
> SNIP





Re: The moral dimension of simulation

2006-08-09 Thread jamikes

Nick (and List):
just a short remark to the very first words of your post below (mostly
erased):
>
> If we are living in a simulation .<
I think this is the usual pretension (not only on this list).
I think "we simulate what we are living in" according to the little we know.
Such 'simulation' - 'simplification' - 'modeling' - 'metaphorizing' - or
even 'harrypotterizing' things we think does not change the "unknown/unknowable"
we live in. We just think "and therefore we think we are."

Most ignorantly and commonsensically yours

John M


- Original Message -
From: "Nick Prince" <[EMAIL PROTECTED]>
To: 
Sent: Tuesday, August 08, 2006 5:05 PM
Subject: RE: The moral dimension of simulation


> If we are living in a simulation (and I believe the matrix hypothesis is a
> real possibility) and if we are all just software constructs then the
> architect has some options available to it.
SNIP





Re: The moral dimension of simulation

2006-08-09 Thread David Nyman

Stathis Papaioannou wrote:

> They're not just simulating us, are they? They might have just slapped
> together a virtual universe in an idle moment to see how it turns out. Maybe
> they're more interested in star formation, or bacteria or something. Is an
> E. coli in your gut justified in thinking God made the universe, including
> human guts, just for its benefit?

Stathis

I see what you mean, of course. However, it's not really what I was
trying to elicit by my original post. If I were to try to justify my
actions to you in the sort of way you describe above, I don't think
you'd be very accepting of this, nor would much of the rest of society.
I don't mean to say that there isn't a great deal of hypocrisy and
deviation from ethical conduct in the real world, but unless one is
prepared to discard the project of working together to make things
better rather than worse, I believe that we should take ethical
dialogue seriously. My sense is that much more advanced civilisations
would have developed in this area too, not just technologically - for
one thing, they have presumably found ways to live in harmony and not
self-destruct.  So at the least these issues would have meaning for
them.

That's why I feel that your dismissal of the issues isn't very
illuminating. BTW, I don't intend this as a complaint, I'm just
clarifying what I had in mind in my original questions - that it would
be interesting to explore the ethical dimensions of possible simulators
and their simulations. I think you're saying that we can't know and
shouldn't care, which I don't find very interesting.

As a challenge to your view, might I suggest that in your example re
the E. coli - if we knew that the E. coli was conscious and had
feelings, we might be more concerned about it. Do you think it's a
reasonable assumption that technologists capable enough to include us
in their simulation, regardless of their 'ultimate purpose', would a)
not know we had consciousness and feelings, or b) not care, and if so,
on what justification? Or is this simply unfathomable? I'm not asking
rhetorically, I'm really interested.

David

> Brent Meeker writes:
>
> > David Nyman wrote:
> > > Stathis Papaioannou wrote:
> > >
> > >
> > >>Perhaps it says something about the nature of the simulation's creators,
> > >>but I don't see that it says anything about the probability that we are
> > >>living in one.
> > >
> > >
> > > Do you mean that if we are living in one, then the moral standards of
> > > its creators are reprehensible (to our way of thinking) or at least
> > > opaque?
> >
> > But the hypothesis that the creators are like us is part of the
> > justification for supposing they would run simulations of intelligent
> > beings.  If you then argue that their motivations and ethics might be alien
> > to us, you've discarded any reason for supposing they would simulate us.
>
> They're not just simulating us, are they? They might have just slapped
> together a virtual universe in an idle moment to see how it turns out. Maybe
> they're more interested in star formation, or bacteria or something. Is an
> E. coli in your gut justified in thinking God made the universe, including
> human guts, just for its benefit?
>
> Stathis Papaioannou





RE: The moral dimension of simulation

2006-08-09 Thread Stathis Papaioannou

Brent Meeker writes:

> David Nyman wrote:
> > Stathis Papaioannou wrote:
> > 
> > 
> >>Perhaps it says something about the nature of the simulation's creators,
> >>but I don't see that it says anything about the probability that we are
> >>living in one.
> > 
> > 
> > Do you mean that if we are living in one, then the moral standards of
> > its creators are reprehensible (to our way of thinking) or at least
> > opaque?  
> 
> But the hypothesis that the creators are like us is part of the 
> justification for supposing they would run simulations of intelligent 
> beings.  If you then argue that their motivations and ethics might be alien 
> to us, you've discarded any reason for supposing they would simulate us.

They're not just simulating us, are they? They might have just slapped 
together a virtual universe in an idle moment to see how it turns out. Maybe 
they're more interested in star formation, or bacteria or something. Is an 
E. coli in your gut justified in thinking God made the universe, including 
human guts, just for its benefit?

Stathis Papaioannou



Re: The moral dimension of simulation

2006-08-08 Thread David Nyman

Quentin Anciaux wrote:

> - Why is accepting the simulation argument "simpler" than accepting
> the "multitude of sentient life forms hypothesis"? ;)

Hi Quentin

I think the argument here is based on the presumed lack of practical
constraints on the sheer magnitude of 'simulable observers', which can
be postulated to swamp any countervailing number of 'natural
observers'.  It's because of this that I prefer to focus on other
possible issues constraining such scenarios, because, as Bostrom points
out, if you accept his basic premises (it's feasible and some advanced
civilisation would actually do it) then you can always set the
situation up so that an argument based on 'observer moments' seems to
swamp any objections. I can't help feeling there's something fishy
about this. I've never been very happy with the logic of the doomsday
argument either, but that's another topic.
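The "swamping" step is plain arithmetic: once even a small fraction of civilisations run many ancestor simulations, simulated observer moments dominate. All figures below are invented purely for illustration:

```python
# Toy Bostrom-style bookkeeping; every number here is made up for illustration.
natural_civs = 1
observers_per_civ = 10**10          # observers in one 'natural' history
sims_per_posthuman_civ = 10**6      # ancestor simulations a mature civ runs
p_posthuman = 0.01                  # chance a civ matures and simulates at all

natural_oms = natural_civs * observers_per_civ
simulated_oms = (natural_civs * p_posthuman
                 * sims_per_posthuman_civ * observers_per_civ)

# Self-sampling over all observer moments:
p_simulated = simulated_oms / (simulated_oms + natural_oms)
print(round(p_simulated, 6))  # prints 0.9999
```

The point of contention in the post above is not this arithmetic but whether the premises feeding it can be set up freely enough to swamp any objection.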

David

> Hi David,
>
> Le Mardi 8 Août 2006 15:47, David Nyman a écrit :
> > I'm not sure that Nick Bostrom et al actually take this view.  Rather
> > the notion seems to be based on the assumptions that if this is a
> > feasible thing to do, and unless you could rule out that *some* future
> > civilisation would actually do it, then the huge number of 'observer
> > moments' thus generated would make it probable that we were in fact
> > living in one.  I don't think there are any other assumptions about the
> motivations of the simulators.
>
> What puzzles me about this is that accepting this argument (which has
> similar roots to the doomsday argument) is also accepting the following
> (assuming a purely materialist MWI):
>
> 1- Materialist MWI à la Everett is true.
> 2- So there exist worlds in the plenitude which have a lot of observers.
> 3- Then it is more likely that I'm in a branch where a lot of observers exist.
>
> But I'm (obviously) not... I'm among the roughly 60 billion humans born on
> this earth... and this number seems very low (just for the human life form
> alone). If (physical) MWI is true then I should be part of a universe with
> a lot of sentient life forms; if not, I'm a peculiar case and it would be
> strange that I am. So the argument that I should be in a simulation because
> the majority of OMs should be in a simulation (because we can't rule out
> that our future civilisations will simulate us if it is possible) also
> means that we should be in a universe that has plenty of sentient life
> forms (but it seems we're not). So my question is simply this:
>
> - Why is accepting the simulation argument "simpler" than accepting
> the "multitude of sentient life forms hypothesis"? ;)
>
> Best regards,
> Quentin
>
> P.S.: I would like to apologise to "W.C."; sometimes my thoughts are quicker
> than my fingers /o\
> P.S.2: I also apologise for my horrible English ;) One day, I'll be a perfect
> English speaker, but that day hasn't come yet.





Re: The moral dimension of simulation

2006-08-08 Thread Quentin Anciaux

Hi David,

Le Mardi 8 Août 2006 15:47, David Nyman a écrit :
> I'm not sure that Nick Bostrom et al actually take this view.  Rather
> the notion seems to be based on the assumptions that if this is a
> feasible thing to do, and unless you could rule out that *some* future
> civilisation would actually do it, then the huge number of 'observer
> moments' thus generated would make it probable that we were in fact
> living in one.  I don't think there are any other assumptions about the
> motivations of the simulators.

What puzzles me about this is that accepting this argument (which has similar
roots to the doomsday argument) is also accepting the following
(assuming a purely materialist MWI):

1- Materialist MWI à la Everett is true.
2- So there exist worlds in the plenitude which have a lot of observers.
3- Then it is more likely that I'm in a branch where a lot of observers exist.

But I'm (obviously) not... I'm among the roughly 60 billion humans born on this
earth... and this number seems very low (just for the human life form alone).
If (physical) MWI is true then I should be part of a universe with a lot of
sentient life forms; if not, I'm a peculiar case and it would be strange that
I am. So the argument that I should be in a simulation because the majority
of OMs should be in a simulation (because we can't rule out that our future
civilisations will simulate us if it is possible) also means that we should
be in a universe that has plenty of sentient life forms (but it seems we're
not). So my question is simply this:

- Why is accepting the simulation argument "simpler" than accepting
the "multitude of sentient life forms hypothesis"? ;)

Best regards,
Quentin

P.S.: I would like to apologise to "W.C."; sometimes my thoughts are quicker
than my fingers /o\
P.S.2: I also apologise for my horrible English ;) One day, I'll be a perfect
English speaker, but that day hasn't come yet.




RE: The moral dimension of simulation

2006-08-08 Thread Nick Prince

If we are living in a simulation (and I believe the matrix hypothesis is a
real possibility) and if we are all just software constructs then the
architect has some options available to it. If it is benevolent, then it
could copy (and save) the state of the simulation at various times.  It
could then hunt through the saved states to find the nearest copy to the
state of death of a "person" (1st person experiencer?) in the simulation and
paste it into a new environment (resurrect it!).  Maybe it might like to
"grow" individuals that learn to be better this way.  This is my wishful
thinking of how the simulation might work and how its creator might
"interfere" with it to get what it wants.  All the usual arguments from
theology about the purpose/reason of evil and suffering etc. can be brought
into the debate wholesale from here on.  However because the architect may
not be benevolent towards us it may not choose to resurrect us.  If it is
intelligent enough to make such a simulation then maybe it is moral too, but
some have speculated against this (see, for example, Barry Dainton,
"Innocence Lost: Simulation Scenarios: Prospects and Consequences",
The University of Liverpool, October 2002). I can send you
a pdf if you want, but give me an e-mail address to send to.
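The "hunt through the saved states" step is a nearest-neighbour search over checkpoints. A sketch under invented assumptions - the attribute dictionary and the difference-count metric are placeholders for whatever "nearest" would really mean:

```python
# Sketch of the checkpoint-and-resurrect scheme described above.
# All names and the similarity metric are invented for illustration.

snapshots = []  # periodic saves of a person's state, as (time, state) pairs

def save_checkpoint(t, person_state):
    snapshots.append((t, dict(person_state)))

def resurrect(death_state):
    """Find the saved state closest to the state at death and paste a
    copy of it into a fresh environment (the 'resurrection')."""
    def distance(saved):
        # Toy metric: count attributes that differ from the death state.
        return sum(saved[k] != v for k, v in death_state.items() if k in saved)
    t, nearest = min(snapshots, key=lambda snap: distance(snap[1]))
    return {"environment": "new", "person": dict(nearest), "restored_from": t}
```

With several checkpoints saved, resurrect simply picks whichever snapshot shares the most attributes with the state at death.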

When I look at nature it does seem "red in tooth and claw".  Is this what
a "good" simulator would have done? Then again, in simulations things don't
always work out the way the makers figured.  I dunno - I've never managed
to go beyond this, because there is no data to support anything other than to
say the simulation argument is compelling.

Nick Prince 

 

-Original Message-
From: everything-list@googlegroups.com
[mailto:[EMAIL PROTECTED] On Behalf Of David Nyman
Sent: 08 August 2006 00:10
To: Everything List
Subject: Re: The moral dimension of simulation


Nick Prince wrote:

> Who says morality to all other species is useful anyway (for survival), and
> even a defining feature of intelligent species?  In war people kill people
> just like themselves, as long as they wear a different uniform! We drop
> atom bombs and say it was to save life!! (Hiroshima).  This may be true.
> Truth and morality can get in conflict.  Morality can therefore easily get
> lost in the fog.  How expensive it may be to run simulations that generate
> so many forms too.  Perhaps some form of superintelligence decides who will
> live or die in the simulation.  If you've ever been on a hospital waiting
> list for a really life-threatening illness, it is clear how priorities can
> change the moral landscape.  If I made a simulation I would want it to be
> moral, but I don't know what dilemmas the pandora's box generated by the
> simulation and the financial or unknown constraints far above my knowing
> would turn up.  That's the interesting thing about simulations - they are
> run to "see what might be when we can't guess the answers". Yet, I hope
> you're right and I'm wrong.

Nick

I certainly don't know that I'm right and you're wrong.  I didn't post
my original thoughts because I believe that I have knock down arguments
either that we do or should live in a world with specific moral
imperatives, whether or not it is a 'simulation'.  Rather, I hadn't
seen the sorts of value-based issues you mention given an airing in the
somewhat bloodless 'technical' discussions I've previously read on this
possibility.  So my 'questions' aren't at all intended to be merely
rhetorical, but to elicit intelligent responses such as your own.

However, since I do believe that we're lost if we feel we can't take the
issues in our lives seriously, my motive in posting was in a sense to
test whether others felt in any way influenced in their moral or
practical thinking or behaviour by the possibility (some would say
probability) that our lives are simulations being run for somebody
else's benefit. Some parallels with certain sorts of religious outlook
strike me, in that I've often felt that these do indeed encourage just
such an attitude - that our lives should be ultimately dedicated to
some deity's purposes, not to exploring our own.

One thing you don't say in your interesting response is whether such
considerations influence you personally in any way, or whether they're
just a diverting thought experiment. Your comments would be
interesting.

David

> Dear David
>
> Why is it so difficult to conceive that the simulators should be
> unwitting? Or in some way non-ethical and thoughtless of the pain, fears,
> loves etc. of an interesting by-product (or even possibly irritating
> by-product) of their simulation.  Do you eat meat?  Trap mice? Kill flies?
> Wash bacteria from your hands?  How much time and 

Re: The moral dimension of simulation

2006-08-08 Thread David Nyman

Brent Meeker wrote:

> But the hypothesis that the creators are like us is part of the
> justification for supposing they would run simulations of intelligent
> beings.  If you then argue that their motivations and ethics might be alien
> to us, you've discarded any reason for supposing they would simulate us.

I'm not sure that Nick Bostrom et al actually take this view.  Rather
the notion seems to be based on the assumptions that if this is a
feasible thing to do, and unless you could rule out that *some* future
civilisation would actually do it, then the huge number of 'observer
moments' thus generated would make it probable that we were in fact
living in one.  I don't think there are any other assumptions about the
motivations of the simulators.

Notwithstanding this, I'm interested that you feel that their motives
would not be alien to us.  Does this mean to imply that you think that
our current societies would sanction the running of such simulations if
we could (i.e.if we had the technology right now, rather than waiting
until we had evolved into some hypothetical future civilisation)?  How
would you envisage the debate developing (on the model of stem cell
research, right to life, vivisection, etc.)?   I just wonder if you or
anyone else cared to speculate on the direction of moral evolution into
such hypothetical futures, not just the technological developments.

Personally, although I don't lose sleep over these issues right now,
I'm pretty clear that I would be against any such attempts at
simulating 'life', and I'm interested in how you or others might
predict how I and those with similar views would lose this debate.  Or
is it more likely to be some unpoliceable underground phenomenon? Since
you have implied above that their motives should be comprehensible to
us (a point on which others seem to disagree), perhaps you might want
to comment on these aspects.

David

> David Nyman wrote:
> > Stathis Papaioannou wrote:
> >
> >
> >>Perhaps it says something about the nature of the simulation's creators,
> >>but I don't see that it says anything about the probability that we are
> >>living in one.
> >
> >
> > Do you mean that if we are living in one, then the moral standards of
> > its creators are reprehensible (to our way of thinking) or at least
> > opaque?
>
> But the hypothesis that the creators are like us is part of the
> justification for supposing they would run simulations of intelligent
> beings.  If you then argue that their motivations and ethics might be alien
> to us, you've discarded any reason for supposing they would simulate us.
> 
> Brent Meeker


--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/everything-list
-~--~~~~--~~--~--~---



RE: The moral dimension of simulation

2006-08-07 Thread Stathis Papaioannou

David Nyman writes:

> Stathis Papaioannou wrote:
> 
> > Perhaps it says something about the nature of the simulation's creators,
> > but I don't see that it says anything about the probability that we are
> > living in one.
> 
> Do you mean that if we are living in one, then the moral standards of
> its creators are reprehensible (to our way of thinking) or at least
> opaque?  Would you feel that our believing this puts us in a position
> in any way different from that of deists traditionally trying to fathom
> what the standards or motives of their gods might be? In either case do
> you have a view on how this could or should affect our own standards or
> conduct within the simulation?

I guess that our simulated universe's creators are deist deities if they
are anything, because there is no evidence that they interfere in our
affairs once the simulation is underway. If they did, then we might
learn something about them, but as it is their motives, moral standards
and so on are unknowable. In any case, why would near-omnipotent beings
who cannot suffer (as you rise up the path towards omnipotence, surely
eliminating the possibility of unpleasant things happening to you or
your society would be a priority) have a sense of moral responsibility
towards lesser creatures? God's attitude towards us might be like our
attitude towards bacteria. If we discovered that bacteria did, after
all, have a rudimentary sentience and at the very least did not want to
be killed, what would we do about it? What would we do if we could do
nothing about it, and understood that we committed mass murder every
time we boiled water? I think we would redefine "murder" so that it
didn't apply to lesser organisms, but only to our own exalted species.
If we can justify killing or enslaving humans with a different skin
colour to our own, or other mammalian species, how much easier would it
be for near-omnipotent beings to be indifferent to the suffering of
mere computer software?

As for how knowledge of our creators should affect us, aside from the
enormous scientific interest it would create, I don't see that it should
make any difference to how we live our lives. I think even those
religious people who base their morality on laws they believe to be
handed down from God would continue, for the most part, living their
lives the same way if they realised that this was a fiction (although no
doubt most of them would continue with their previous faith no matter
what evidence they were presented with).

Stathis Papaioannou


> > David Nyman writes:
> >
> > > I don't know whether these issues have been given an airing here, but
> > > I have a couple of thoughts about whether we're really 'in the
> > > Matrix', a la Nick Bostrom.
> > >
> > > Firstly, a moral issue. At least at the level of public debate, in our
> > > (apparent?) reality there is considerable sensitivity to interfering
> > > with fundamental issues of human freedom and dignity, and of avoiding
> > > where possible the infliction of unnecessary suffering, either to
> > > humans or other sentient organisms.  It seems to me that if we are to
> > > take seriously the idea that significant numbers of advanced
> > > civilisations would 'simulate' us in the 'feelingful' way we
> > > (or at least I) experience, that significant moral issues are raised.
> > > These are not dissimilar to the paradoxes raised by the juxtaposition
> > > of an all-loving and omnipotent God.  None of this is to claim a
> > > knock-down argument, but nevertheless it places a constraint on the
> > > kind of 'civilisation' that might undertake such an exercise,
> > > especially in those scenarios that take it to be some sort of game or
> > > entertainment.
> >
> > You're holding the beings running the simulation to awfully high standards.
> > Humans have always persecuted their own kind, let alone other species,
> > and have always managed to find rationalisations to explain why it isn't
> > really "bad". Even if technologically superior alien societies have similar
> > ethics to our own, by no means a given, what if our universe is being
> > simulated by their equivalent of a psychopath, or even a teenager in his
> > bedroom?
> >
> > > Secondly, what sort of role are 'we' supposed to be playing?  On the
> > > one hand, we may simply be required to play a part 'intelligently',
> > > or at least predictably, for the benefit of the 'real' players.  In
> > > this case, would they need to go to the trouble of making us
> > > 'sentient'?  Or can we take this as evidence that the complexity
> > > required for 'intelligence' simply gives rise to such sentience?
> >
> > I'd say that our sentience is a side-effect of our intelligence. Even if we
> > are part of the simulation, the simulation seems to consistently follow
> > evolutionary theory, and how or why would sentience develop if it were
> > possible to have the same behaviour without it? I think this is a convincing
> > argument against the existence of intelligent zombies.

Re: The moral dimension of simulation

2006-08-07 Thread Brent Meeker

David Nyman wrote:
> Stathis Papaioannou wrote:
> 
> 
>>Perhaps it says something about the nature of the simulation's creators,
>>but I don't see that it says anything about the probability that we are
>>living in one.
> 
> 
> Do you mean that if we are living in one, then the moral standards of
> its creators are reprehensible (to our way of thinking) or at least
> opaque?  

But the hypothesis that the creators are like us is part of the 
justification for supposing they would run simulations of intelligent 
beings.  If you then argue that their motivations and ethics might be alien 
to us, you've discarded any reason for supposing they would simulate us.

Brent Meeker





Re: The moral dimension of simulation

2006-08-07 Thread David Nyman

Nick Prince wrote:

> Who says morality towards all other species is useful anyway (for
> survival), or even a defining feature of intelligent species?  In war
> people kill people just like themselves, as long as they wear a
> different uniform!  We drop atom bombs and say it was to save life!!
> (Hiroshima).  This may be true.  Truth and morality can come into
> conflict.  Morality can therefore easily get lost in the fog.  Consider
> too how expensive it may be to run simulations that generate so many
> forms.  Perhaps some form of superintelligence decides who will live or
> die in the simulation.  If you've ever been on a hospital waiting list
> for a really life-threatening illness, it is clear how priorities can
> change the moral landscape.  If I made a simulation I would want it to
> be moral, but I don't know what dilemmas the Pandora's box generated by
> the simulation - and the financial or unknown constraints far above my
> knowing - would turn up.  That's the interesting thing about
> simulations - they are run to "see what might be when we can't guess
> the answers".  Yet I hope you're right and I'm wrong.

Nick

I certainly don't know that I'm right and you're wrong.  I didn't post
my original thoughts because I believe that I have knock down arguments
either that we do or should live in a world with specific moral
imperatives, whether or not it is a 'simulation'.  Rather, I hadn't
seen the sorts of value-based issues you mention given an airing in the
somewhat bloodless 'technical' discussions I've previously read on this
possibility.  So my 'questions' aren't at all intended to be merely
rhetorical, but to elicit intelligent responses such as your own.

However, since I do believe that we're lost if we feel we can't take the
issues in our lives seriously, my motive in posting was in a sense to
test whether others felt in any way influenced in their moral or
practical thinking or behaviour by the possibility (some would say
probability) that our lives are simulations being run for somebody
else's benefit. Some parallels with certain sorts of religious outlook
strike me, in that I've often felt that these do indeed encourage just
such an attitude - that our lives should be ultimately dedicated to
some deity's purposes, not to exploring our own.

One thing you don't say in your interesting response is whether such
considerations influence you personally in any way, or whether they're
just a diverting thought experiment. Your comments would be
interesting.

David

> Dear David
>
> Why is it so difficult to conceive that the simulators might be
> unwitting, or in some way unethical and thoughtless of the pain, fears,
> loves etc. of an interesting by-product (or even possibly irritating
> by-product) of their simulation?  Do you eat meat?  Trap mice?  Kill
> flies?  Wash bacteria from your hands?  How much time and concern do
> you (we) give to these life forms?  For all we know the cockroach may
> be the purposeful study of the simulation we are in - or even whichever
> species is "the surviving species" of interest at t = time-to-stop.  I
> know it feels like we should be important but, in the scale of things,
> it's probably just wishful thinking.  A hugely more intelligent species
> may not even be moral.
>
> Who says morality towards all other species is useful anyway (for
> survival), or even a defining feature of intelligent species?  In war
> people kill people just like themselves, as long as they wear a
> different uniform!  We drop atom bombs and say it was to save life!!
> (Hiroshima).  This may be true.  Truth and morality can come into
> conflict.  Morality can therefore easily get lost in the fog.  Consider
> too how expensive it may be to run simulations that generate so many
> forms.  Perhaps some form of superintelligence decides who will live or
> die in the simulation.  If you've ever been on a hospital waiting list
> for a really life-threatening illness, it is clear how priorities can
> change the moral landscape.  If I made a simulation I would want it to
> be moral, but I don't know what dilemmas the Pandora's box generated by
> the simulation - and the financial or unknown constraints far above my
> knowing - would turn up.  That's the interesting thing about
> simulations - they are run to "see what might be when we can't guess
> the answers".  Yet I hope you're right and I'm wrong.
>
> Nick Prince
>
>
> -Original Message-
> From: everything-list@googlegroups.com
> [mailto:[EMAIL PROTECTED] On Behalf Of David Nyman
> Sent: 07 August 2006 00:16
> To: Everything List
> Subject: Re: The moral dimension of simulation
>
>
> But your observation goes to the heart of my question.

RE: The moral dimension of simulation

2006-08-07 Thread Nick Prince

Dear David

Why is it so difficult to conceive that the simulators might be
unwitting, or in some way unethical and thoughtless of the pain, fears,
loves etc. of an interesting by-product (or even possibly irritating
by-product) of their simulation?  Do you eat meat?  Trap mice?  Kill
flies?  Wash bacteria from your hands?  How much time and concern do you
(we) give to these life forms?  For all we know the cockroach may be the
purposeful study of the simulation we are in - or even whichever species
is "the surviving species" of interest at t = time-to-stop.  I know it
feels like we should be important but, in the scale of things, it's
probably just wishful thinking.  A hugely more intelligent species may
not even be moral.

Who says morality towards all other species is useful anyway (for
survival), or even a defining feature of intelligent species?  In war
people kill people just like themselves, as long as they wear a
different uniform!  We drop atom bombs and say it was to save life!!
(Hiroshima).  This may be true.  Truth and morality can come into
conflict.  Morality can therefore easily get lost in the fog.  Consider
too how expensive it may be to run simulations that generate so many
forms.  Perhaps some form of superintelligence decides who will live or
die in the simulation.  If you've ever been on a hospital waiting list
for a really life-threatening illness, it is clear how priorities can
change the moral landscape.  If I made a simulation I would want it to
be moral, but I don't know what dilemmas the Pandora's box generated by
the simulation - and the financial or unknown constraints far above my
knowing - would turn up.  That's the interesting thing about simulations
- they are run to "see what might be when we can't guess the answers".
Yet I hope you're right and I'm wrong.

Nick Prince

  
-Original Message-
From: everything-list@googlegroups.com
[mailto:[EMAIL PROTECTED] On Behalf Of David Nyman
Sent: 07 August 2006 00:16
To: Everything List
Subject: Re: The moral dimension of simulation


But your observation goes to the heart of my question.  If we were
indeed 'merely incidental' (from whose perspective?) then what would
this say about the ethical position of the simulaters?  Further, if we
are merely playing the role of 'simple automata' then what is the
purpose (from the simulaters' viewpoint) of our *conscious* fears,
pains, loves, life struggle, and so forth?  Are these just an
unavoidable and unimportant (except to us) 'epiphenomenon' of the
simulation method?  Or are they what you mean by an 'interesting
pattern'?  Are we to take our creators' position as being 'superior' to
ours and if so what does this imply for our own (periodic) moral
delicacy about the rights and feelings of others - should we perhaps
view this as mere naivety or lack of intelligence in the light of our
masters' indifference to ours?

These are the issues I'm attempting to raise in the context of the
'simulation hypothesis'.  Of course, there's an aspect of this that
recapitulates the struggle throughout history to establish humane moral
criteria in the face of various arbitrary and omnipotent god-figures,
or for that matter 'blind necessity'.  Even in the teeth of your
creator, you are not forced to accept the justice of his position, even
as you bow to his overwhelming force, as Job shows us.

David













Re: The moral dimension of simulation

2006-08-07 Thread David Nyman

Stathis Papaioannou wrote:

> Perhaps it says something about the nature of the simulation's creators,
> but I don't see that it says anything about the probability that we are
> living in one.

Do you mean that if we are living in one, then the moral standards of
its creators are reprehensible (to our way of thinking) or at least
opaque?  Would you feel that our believing this puts us in a position
in any way different from that of deists traditionally trying to fathom
what the standards or motives of their gods might be? In either case do
you have a view on how this could or should affect our own standards or
conduct within the simulation?

David

> David Nyman writes:
>
> > I don't know whether these issues have been given an airing here, but
> > I have a couple of thoughts about whether we're really 'in the
> > Matrix', a la Nick Bostrom.
> >
> > Firstly, a moral issue. At least at the level of public debate, in our
> > (apparent?) reality there is considerable sensitivity to interfering
> > with fundamental issues of human freedom and dignity, and of avoiding
> > where possible the infliction of unnecessary suffering, either to
> > humans or other sentient organisms.  It seems to me that if we are to
> > take seriously the idea that significant numbers of advanced
> > civilisations would 'simulate' us in the 'feelingful' way we
> > (or at least I) experience, that significant moral issues are raised.
> > These are not dissimilar to the paradoxes raised by the juxtaposition
> > of an all-loving and omnipotent God.  None of this is to claim a
> > knock-down argument, but nevertheless it places a constraint on the
> > kind of 'civilisation' that might undertake such an exercise,
> > especially in those scenarios that take it to be some sort of game or
> > entertainment.
>
> You're holding the beings running the simulation to awfully high standards.
> Humans have always persecuted their own kind, let alone other species,
> and have always managed to find rationalisations to explain why it isn't
> really "bad". Even if technologically superior alien societies have similar
> ethics to our own, by no means a given, what if our universe is being
> simulated by their equivalent of a psychopath, or even a teenager in his
> bedroom?
>
> > Secondly, what sort of role are 'we' supposed to be playing?  On the
> > one hand, we may simply be required to play a part 'intelligently',
> > or at least predictably, for the benefit of the 'real' players.  In
> > this case, would they need to go to the trouble of making us
> > 'sentient'?  Or can we take this as evidence that the complexity
> > required for 'intelligence' simply gives rise to such sentience?
>
> I'd say that our sentience is a side-effect of our intelligence. Even if we
> are part of the simulation, the simulation seems to consistently follow
> evolutionary theory, and how or why would sentience develop if it were
> possible to have the same behaviour without it? I think this is a convincing
> argument against the existence of intelligent zombies.
>
> > Thirdly, is part of the point that 'they' share 'our'
> > experiences?  If so, what does this say about the supposedly privileged
> > relation between an individual and her experience?  Or is it just that
> > they get a third-party 'read-out' of our experiences?  Well, again,
> > would it then be necessary for us to go through the whole messy
> > business 'consciously' for such reporting to occur?
>
> Another reason why it appears that consciousness is a necessary side-effect
> of human level intelligent behaviour.
>
> > It seems to me that the above, and similar, considerations may act to
> > constrain the likelihood of there being such simulations, their nature,
> > or our 'actually' being in one, but I'm unable to say to what degree.
>
> Perhaps it says something about the nature of the simulation's creators,
> but I don't see that it says anything about the probability that we are
> living in one.
>
> Stathis Papaioannou





RE: The moral dimension of simulation

2006-08-06 Thread Stathis Papaioannou

David Nyman writes:

> I don't know whether these issues have been given an airing here, but
> I have a couple of thoughts about whether we're really 'in the
> Matrix', a la Nick Bostrom.
> 
> Firstly, a moral issue. At least at the level of public debate, in our
> (apparent?) reality there is considerable sensitivity to interfering
> with fundamental issues of human freedom and dignity, and of avoiding
> where possible the infliction of unnecessary suffering, either to
> humans or other sentient organisms.  It seems to me that if we are to
> take seriously the idea that significant numbers of advanced
> civilisations would 'simulate' us in the 'feelingful' way we
> (or at least I) experience, that significant moral issues are raised.
> These are not dissimilar to the paradoxes raised by the juxtaposition
> of an all-loving and omnipotent God.  None of this is to claim a
> knock-down argument, but nevertheless it places a constraint on the
> kind of 'civilisation' that might undertake such an exercise,
> especially in those scenarios that take it to be some sort of game or
> entertainment.

You're holding the beings running the simulation to awfully high standards. 
Humans have always persecuted their own kind, let alone other species, 
and have always managed to find rationalisations to explain why it isn't 
really "bad". Even if technologically superior alien societies have similar 
ethics to our own, by no means a given, what if our universe is being 
simulated by their equivalent of a psychopath, or even a teenager in his 
bedroom?

> Secondly, what sort of role are 'we' supposed to be playing?  On the
> one hand, we may simply be required to play a part 'intelligently',
> or at least predictably, for the benefit of the 'real' players.  In
> this case, would they need to go to the trouble of making us
> 'sentient'?  Or can we take this as evidence that the complexity
> required for 'intelligence' simply gives rise to such sentience?

I'd say that our sentience is a side-effect of our intelligence. Even if we 
are part of the simulation, the simulation seems to consistently follow 
evolutionary theory, and how or why would sentience develop if it were 
possible to have the same behaviour without it? I think this is a convincing 
argument against the existence of intelligent zombies.

> Thirdly, is part of the point that 'they' share 'our'
> experiences?  If so, what does this say about the supposedly privileged
> relation between an individual and her experience?  Or is it just that
> they get a third-party 'read-out' of our experiences?  Well, again,
> would it then be necessary for us to go through the whole messy
> business 'consciously' for such reporting to occur?

Another reason why it appears that consciousness is a necessary side-effect 
of human level intelligent behaviour.

> It seems to me that the above, and similar, considerations may act to
> constrain the likelihood of there being such simulations, their nature,
> or our 'actually' being in one, but I'm unable to say to what degree.

Perhaps it says something about the nature of the simulation's creators, 
but I don't see that it says anything about the probability that we are 
living in one.

Stathis Papaioannou
_
Be one of the first to try Windows Live Mail.
http://ideas.live.com/programpage.aspx?versionId=5d21c51a-b161-4314-9b0e-4911fb2b2e6d



Re: The moral dimension of simulation

2006-08-06 Thread David Nyman

But your observation goes to the heart of my question.  If we were
indeed 'merely incidental' (from whose perspective?) then what would
this say about the ethical position of the simulaters?  Further, if we
are merely playing the role of 'simple automata' then what is the
purpose (from the simulaters' viewpoint) of our *conscious* fears,
pains, loves, life struggle, and so forth?  Are these just an
unavoidable and unimportant (except to us) 'epiphenomenon' of the
simulation method?  Or are they what you mean by an 'interesting
pattern'?  Are we to take our creators' position as being 'superior' to
ours and if so what does this imply for our own (periodic) moral
delicacy about the rights and feelings of others - should we perhaps
view this as mere naivety or lack of intelligence in the light of our
masters' indifference to ours?

These are the issues I'm attempting to raise in the context of the
'simulation hypothesis'.  Of course, there's an aspect of this that
recapitulates the struggle throughout history to establish humane moral
criteria in the face of various arbitrary and omnipotent god-figures,
or for that matter 'blind necessity'.  Even in the teeth of your
creator, you are not forced to accept the justice of his position, even
as you bow to his overwhelming force, as Job shows us.

David





RE: The moral dimension of simulation

2006-08-06 Thread Nick Prince

It could be that we are merely incidental to the purpose of the simulation.
In the Game of Life, for example, many interesting patterns emerge from
simple automata.  In the case of that game, AFAIK the only purpose was to
demonstrate the possibility of complexity arising from simplicity.
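For readers unfamiliar with it, the Game of Life mentioned above can be sketched in a few lines (an editorial illustration, not part of the original message): each cell lives or dies purely by counting its eight neighbours, yet structured, self-propagating patterns emerge.

```python
# Minimal Conway's Game of Life: cells are (x, y) pairs in a set.
from collections import Counter

def step(live):
    """Advance one generation; `live` is a set of (x, y) live cells."""
    # Tally how many live neighbours every nearby cell has.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 live neighbours; survival on 2 or 3.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# A 'glider': five cells that crawl diagonally across the grid forever.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
gen = glider
for _ in range(4):
    gen = step(gen)
# After 4 generations the glider reappears shifted by (+1, +1).
assert gen == {(x + 1, y + 1) for (x, y) in glider}
```

The two rules are the whole 'physics' of that universe; gliders, oscillators and even universal computers are incidental by-products, which is exactly the sense of "merely incidental" at issue here.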

Nick Prince

-Original Message-
From: everything-list@googlegroups.com
[mailto:[EMAIL PROTECTED] On Behalf Of David Nyman
Sent: 06 August 2006 19:43
To: Everything List
Subject: The moral dimension of simulation


I don't know whether these issues have been given an airing here, but
I have a couple of thoughts about whether we're really 'in the
Matrix', a la Nick Bostrom.

Firstly, a moral issue. At least at the level of public debate, in our
(apparent?) reality there is considerable sensitivity to interfering
with fundamental issues of human freedom and dignity, and of avoiding
where possible the infliction of unnecessary suffering, either to
humans or other sentient organisms.  It seems to me that if we are to
take seriously the idea that significant numbers of advanced
civilisations would 'simulate' us in the 'feelingful' way we
(or at least I) experience, that significant moral issues are raised.
These are not dissimilar to the paradoxes raised by the juxtaposition
of an all-loving and omnipotent God.  None of this is to claim a
knock-down argument, but nevertheless it places a constraint on the
kind of 'civilisation' that might undertake such an exercise,
especially in those scenarios that take it to be some sort of game or
entertainment.

Secondly, what sort of role are 'we' supposed to be playing?  On the
one hand, we may simply be required to play a part 'intelligently',
or at least predictably, for the benefit of the 'real' players.  In
this case, would they need to go to the trouble of making us
'sentient'?  Or can we take this as evidence that the complexity
required for 'intelligence' simply gives rise to such sentience?

Thirdly, is part of the point that 'they' share 'our'
experiences?  If so, what does this say about the supposedly privileged
relation between an individual and her experience?  Or is it just that
they get a third-party 'read-out' of our experiences?  Well, again,
would it then be necessary for us to go through the whole messy
business 'consciously' for such reporting to occur?

It seems to me that the above, and similar, considerations may act to
constrain the likelihood of there being such simulations, their nature,
or our 'actually' being in one, but I'm unable to say to what degree.

Any thoughts?

David













Re: The moral dimension of simulation

2006-08-06 Thread Brent Meeker

David Nyman wrote:
> I don't know whether these issues have been given an airing here, but
> I have a couple of thoughts about whether we're really 'in the
> Matrix', a la Nick Bostrom.
> 
> Firstly, a moral issue. At least at the level of public debate, in our
> (apparent?) reality there is considerable sensitivity to interfering
> with fundamental issues of human freedom and dignity, and of avoiding
> where possible the infliction of unnecessary suffering, either to
> humans or other sentient organisms.  It seems to me that if we are to
> take seriously the idea that significant numbers of advanced
> civilisations would 'simulate' us in the 'feelingful' way we
> (or at least I) experience, that significant moral issues are raised.

A good point.  It was also raised by Stanislaw Lem in one of his Cyberiad
stories. Trurl is asked to solve the problem of a sadistic king who tortures
his subjects.  He does this by creating a simulated kingdom in which the
king can satisfy his sadistic urges by torturing simulated subjects.  But
when Klaupacius questions him about this, he discovers that Trurl has
provided the simulated subjects with feelings - since the king could not
enjoy torturing beings that didn't "really" feel anything. Klaupacius points
out that this is just as bad as before and Trurl has done an unethical thing
in creating this simulation.

Brent Meeker







The moral dimension of simulation

2006-08-06 Thread David Nyman

I don't know whether these issues have been given an airing here, but
I have a couple of thoughts about whether we're really 'in the
Matrix', a la Nick Bostrom.

Firstly, a moral issue. At least at the level of public debate, in our
(apparent?) reality there is considerable sensitivity to interfering
with fundamental issues of human freedom and dignity, and of avoiding
where possible the infliction of unnecessary suffering, either to
humans or other sentient organisms.  It seems to me that if we are to
take seriously the idea that significant numbers of advanced
civilisations would 'simulate' us in the 'feelingful' way we
(or at least I) experience, that significant moral issues are raised.
These are not dissimilar to the paradoxes raised by the juxtaposition
of an all-loving and omnipotent God.  None of this is to claim a
knock-down argument, but nevertheless it places a constraint on the
kind of 'civilisation' that might undertake such an exercise,
especially in those scenarios that take it to be some sort of game or
entertainment.

Secondly, what sort of role are 'we' supposed to be playing?  On the
one hand, we may simply be required to play a part 'intelligently',
or at least predictably, for the benefit of the 'real' players.  In
this case, would they need to go to the trouble of making us
'sentient'?  Or can we take this as evidence that the complexity
required for 'intelligence' simply gives rise to such sentience?

Thirdly, is part of the point that 'they' share 'our'
experiences?  If so, what does this say about the supposedly privileged
relation between an individual and her experience?  Or is it just that
they get a third-party 'read-out' of our experiences?  Well, again,
would it then be necessary for us to go through the whole messy
business 'consciously' for such reporting to occur?

It seems to me that the above, and similar, considerations may act to
constrain the likelihood of there being such simulations, their nature,
or our 'actually' being in one, but I'm unable to say to what degree.

Any thoughts?

David

