On Tue, Jan 15, 2002 at 08:24:13PM -0700, Brent Meeker wrote:
I don't understand your reasoning about compassion. The point is that
you have a feeling about a possible future you imagine, and so you take
action to avoid that future.
What I mean is that the future should be the causal future of the
Wei Dai wrote:
Suppose someone offered you $1000, but if you accepted, Earth
would be destroyed and everyone on it killed as soon as you die. Would you
take that offer? Even if you did, I'm sure most people wouldn't.
Hello Wei
On 15-Jan-02, Wei Dai wrote:
On Tue, Jan 15, 2002 at 12:47:18PM +0100, Marchal wrote:
This is because I include in the first person its possible
feeling of compassion for possible others. (This is
similar to what Brent Meeker said in his last post).
Compassion, although it
On 11-Jan-02, Wei Dai wrote:
I don't agree with this, because, as I said earlier, people expend effort to
obtain results that they'll never see, for example by writing wills.
Clearly what is a gain for a subject is not only based on first-person
experiences. Suppose someone offered you $1000,
Wei writes:
Suppose you want to crack a bank's encryption key, which is worth $4
million to you, and there are two ways to do it. You can spend $2 million
to build a quantum computer to crack the key, or you can spend $3 million
to build a classical computer to do this. Now if you believe the
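Wei's example above comes down to a simple net-payoff comparison between the two ways of cracking the key. A minimal sketch of the arithmetic, using only the figures stated in the post (the variable names are mine, added for illustration):

```python
# Figures from Wei's example: the cracked key is worth $4M,
# and there are two ways to obtain it.
key_value = 4_000_000       # value of cracking the bank's key
quantum_cost = 2_000_000    # cost of building a quantum computer
classical_cost = 3_000_000  # cost of building a classical computer

# Net gain of each route, ignoring any further considerations
# (the post is cut off before Wei states the complication).
quantum_net = key_value - quantum_cost      # $2,000,000
classical_net = key_value - classical_cost  # $1,000,000

print(quantum_net, classical_net)  # 2000000 1000000
```

On the plain figures the quantum route nets $1M more; the snippet breaks off just as Wei introduces the belief that presumably complicates this comparison.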
On Fri, Jan 11, 2002 at 04:59:47PM -0800, [EMAIL PROTECTED] wrote:
I'm having a lot of trouble understanding this view.
Thanks for taking the time to write the questions. I hope this response
helps.
Why should you care more or less about slow-to-compute universes?
I don't see any reason to