W. C. wrote:
>>From: Brent Meeker
>>...
>>But I like to eat. I like to eat steak. A world in which I can't eat
>>steak is not perfect for me.
>>
>>
>>>People with common intelligence can easily *imagine* (or dream) what a
>>
>>>PU will be.
>>
>>I guess I have uncommon intelligence :-) since
Stathis Papaioannou wrote:
> Perhaps it says something about the nature of the simulation's creators,
> but I don't see that it says anything about the probability that we are
> living in one.
Do you mean that if we are living in one, then the moral standards of
its creators are reprehensible (t
On 05-Aug-06, at 17:03, David Nyman wrote:
>
> Hi Bruno
>
> I think you're right about the complexity. It's because at this stage
> I'm just trying to discover whether this is a distinction that any of
> us think is true or useful, so I'm deliberately (but perhaps not always
> helpfully alas
On 06-Aug-06, at 15:59, Stathis Papaioannou wrote:
>
> Russell Standish writes:
>
>> This is one of those truly cracked ideas that is not wise to air in
>> polite company. Nevertheless, it can be fun to play around with in
>> this forum. I had a similarly cracked idea a few years ago about 1s
On 07-Aug-06, at 01:44, W. C. wrote:
>
>> From: Bruno Marchal
>> ...
>> But it is easy to explain that this is already a "simple" consequence
>> of
>>comp. Any piece of "matter" is the result of a sum over an infinity of
>> interfering computations: there is no reason to expect this to be
>>
George Levy wrote:
> I don't really see any problem if we think of a conscious entity just
> like a proposition as information. Proposition p is information which
> can be either true or false. A conscious entity is also information.
> In this case, if the information is true then the entity
>From: Bruno Marchal
>...
>Comp says that there is a level of description of yourself such that you
>survive through an emulation done at that level. But the UD will simulate
>not only that level but all levels below. So comp makes the following
>prediction: if you look at yourself or at you ne
On 07-Aug-06, at 15:52, W. C. wrote:
>
>> From: Bruno Marchal
>> ...
>> Comp says that there is a level of description of yourself such that
>> you
>> survive through an emulation done at that level. But the UD will
>> simulate
>> not only that level but all levels below. So comp makes the
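The UD mentioned above rests on dovetailing: interleaving the execution of an unbounded family of programs so that every program eventually receives arbitrarily many steps, with none blocking the others. A minimal sketch of just that scheduling trick — the "programs" here are hypothetical toy generators, not a real enumeration of all machines of a universal system:

```python
# Sketch of dovetailing: in round k, admit program k and give one step
# to every program admitted so far. Each program thus accumulates
# unboundedly many steps even though infinitely many programs exist.
# The "programs" are toy stand-ins; a real UD would enumerate all
# programs of a universal machine.

def program(n):
    """Toy program n: yields its own (program, step) pairs forever."""
    step = 0
    while True:
        yield (n, step)
        step += 1

def dovetail(rounds):
    """Run the given number of rounds and return the interleaved trace."""
    trace = []
    running = []
    for k in range(rounds):
        running.append(program(k))   # a new program joins each round
        for p in running:
            trace.append(next(p))    # one step for every program so far
    return trace

trace = dovetail(4)
# After 4 rounds, program 0 has received 4 steps and program 3 one step,
# yet no program ever had to terminate for the others to proceed.
```

The point the snippet above makes is visible in the trace: steps of the "same" program at every level are generated side by side with steps of all the others, without any program being privileged.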
Bruno Marchal wrote:
> All right. (I hope you realize that you are very ambitious, but then
> that is how we learn).
Yes, learning is my aim here.
> My terminological problem here is that "experience" and "knowledge"
> are usually put in the "epistemology" instead of ontology. Of course I
> kn
Bruno Marchal wrote:
> Of course those physicists who believe in the wave collapse will have
> more reason than Everett followers to swallow what I say.
Not much more. Physical MWI is a materialist-contingent-empiricist
theory
and therefore just as much opposed to your
idealist-necessitarian-ra
George Levy wrote:
> A conscious entity is also information.
Really ? Why is that ?
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google Groups
"Everything List" group.
To post to this group, send email to everything-list@go
1Z wrote:
> George Levy wrote:
>> A conscious entity is also information.
I am assuming here that a conscious entity is essentially "software."
George
David Nyman wrote:
> Bruno Marchal wrote:
>
> > All right. (I hope you realize that you are very ambitious, but then
> > that is how we learn).
>
> Yes, learning is my aim here.
>
> > My terminological problem here is that "experience" and "knowledge"
> > are usually put in the "epistemology" in
George Levy wrote:
> 1Z wrote:
>
> >George Levy wrote:
> >>A conscious entity is also information.
> I am assuming here that a conscious entity is essentially "software."
You can assume it if you like. It isn't computationalism, which
is the claim that cognition is running a programme, not the
1Z wrote:
> > I'll try to nail this here. I take 'ontology' to refer to issues of
> > existence or being, where 'epistemology' refers to knowledge, or 'what
> > and how we know'. When I say that our 'ontology' is manifest, I'm
> > claiming (perhaps a little more cautiously than Descartes): 'I a
Dear David
Why is it so difficult to conceive that the simulators might be
unwitting? Or in some way unethical and thoughtless of the pain, fears,
loves, etc. of an interesting by-product (or even possibly irritating
by-product) of their simulation. Do you eat meat? Trap mice? Kill flies? Wa
Nick Prince wrote:
> Who says morality toward all other species is useful anyway (for survival), or
> even a defining feature of intelligent species? In war people kill people
> just like themselves, as long as they wear a different uniform! We drop atom
> bombs and say it was to save life!! (Hiroshi
>From: Bruno Marchal
>...
>Not at all. I mean it in the operational physical sense. Like observing
>your hand with a microscope, or looking closely to the "path" of an
>electron.
>...
Any microscope (optical or electron type)? What's the min. magnification &
resolution to see it?
I need to fin
John,
Perhaps I have misunderstood if you were presenting an alternative theory:
it's easy to misunderstand the often complex ideas discussed on this list.
Could
you explain your theory, and how it could be immune to being proved wrong?
Stathis Papaioannou
> Stathis,
> you (of all people) u
Russell Standish:
> On Sun, Aug 06, 2006 at 11:59:42PM +1000, Stathis Papaioannou wrote:
> > My thought was that if there are twice as many copies of you running in
> > parallel,
> > you are in a sense cramming twice as much experience into a given objective
> > time
> > period, so maybe this
Bruno Marchal writes (quoting SP):
> > ...a controlled
> > experiment in which measure can be turned up and down leaving
> > everything else
> > the same, such as having an AI running on several computers in perfect
> > lockstep.
>
>
> I think that the idea that a lower measure OM will appe
>From: Stathis Papaioannou
>
>...
>Classical teleportation cannot copy something exactly down to the quantum level,
>but rather involves making a "close enough" copy. It is obvious, I think,
>that this is theoretically possible, but it is not immediately obvious how
>good the copy of a person would ha
>From: W. C.
>
> >From: Bruno Marchal
> >...
> >Not at all. I mean it in the operational physical sense. Like observing
> >your hand with a microscope, or looking closely to the "path" of an
> >electron.
> >...
>
>Any microscope (optical or electron type)? What's the min. magnification &
>resol
David Nyman wrote:
> Stathis Papaioannou wrote:
>
>
>>Perhaps it says something about the nature of the simulation's creators,
>>but I don't see that it says anything about the probability that we are
>>living in one.
>
>
> Do you mean that if we are living in one, then the moral standards of
David Nyman writes:
> Stathis Papaioannou wrote:
>
> > Perhaps it says something about the nature of the simulation's creators,
> > but I don't see that it says anything about the probability that we are
> > living in one.
>
> Do you mean that if we are living in one, then the moral standards o