A thermostat perceives the temperature and acts on it. Is it conscious?
Registering does not equal perceiving. I mean subjective experience.
We think we know what consciousness is.
Actually, I'm quite aware that I don't. I find consciousness to be the
greatest puzzle in the universe, but
On 28/06/07, Matt Mahoney [EMAIL PROTECTED] wrote:
When logic conflicts with instinct, instinct wins and the logic gets
contorted. The heated discussion on the copy paradox is a perfect example.
Your consciousness is transferred to the copy only if the original is
destroyed, or destroyed in
On Jun 28, 2007, at 12:56 PM, Charles D Hixson wrote:
Stathis Papaioannou wrote:
Yes, you would live on in one of the copies as if uploaded, and yes
the selection of which copy would be purely random, dependent on the
relative frequency of each copy (you can still define a measure to derive
--- Niels-Jeroen Vandamme
[EMAIL PROTECTED] wrote:
From: Charles D Hixson [EMAIL PROTECTED]
Reply-To: singularity@v2.listbox.com
To: singularity@v2.listbox.com
Subject: Re: [singularity] critiques of Eliezer's views on AI
Date: Thu, 28 Jun 2007 09:56:12 -0700
Stathis Papaioannou wrote:
How do you get the 50% chance? There is a 100%
chance of a mind waking up who has been uploaded, and
also a 100% chance of a mind waking up who hasn't.
This doesn't violate the laws of probability because
these aren't mutually exclusive. Asking which one was
you is silly, because we're assuming
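The probability point in the quote above can be made concrete with a minimal sketch (the setup and variable names are mine, not from the thread): in a non-destructive copying scenario, both minds wake in every trial, so each waking event has probability 1, and the axioms are satisfied because the two events are not mutually exclusive.

```python
# Hypothetical illustration: after a non-destructive upload, two minds
# wake in every trial. P(uploaded wakes) = P(non-uploaded wakes) = 1,
# which is consistent because the events are not mutually exclusive:
# their conjunction also has probability 1.

trials = 10_000
a = b = both = 0
for _ in range(trials):
    uploaded_wakes = True      # the upload boots successfully
    original_wakes = True      # the original also wakes (copy was non-destructive)
    a += uploaded_wakes
    b += original_wakes
    both += uploaded_wakes and original_wakes

print(a / trials, b / trials, both / trials)  # 1.0 1.0 1.0
```

The 50% intuition would only follow if exactly one of the two events could occur per trial; here both always do, which is the point being argued.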
On Jun 28, 2007, at 5:18 PM, Tom McCabe wrote:
How do you get the 50% chance? There is a 100%
chance of a mind waking up who has been uploaded, and
also a 100% chance of a mind waking up who hasn't.
This doesn't violate the laws of probability because
these aren't mutually exclusive. Asking
--- Niels-Jeroen Vandamme
[EMAIL PROTECTED] wrote:
This is a textbook case of what Eliezer calls
worshipping a sacred mystery. People tend to act
like a theoretical problem is some kind of God,
something above them in the social order, and since
it's beaten others before you it would be
--- Randall Randall [EMAIL PROTECTED]
wrote:
On Jun 28, 2007, at 5:18 PM, Tom McCabe wrote:
How do you get the 50% chance? There is a 100%
chance of a mind waking up who has been uploaded, and
also a 100% chance of a mind waking up who hasn't.
This doesn't violate the laws of
On Jun 28, 2007, at 5:59 PM, Tom McCabe wrote:
--- Randall Randall [EMAIL PROTECTED]
wrote:
On Jun 28, 2007, at 5:18 PM, Tom McCabe wrote:
How do you get the 50% chance? There is a 100%
chance of a mind waking up who has been uploaded, and
also a 100% chance of a mind waking up who hasn't.
I'm sick and tired of all this endless debate on identity with regards
to uploading.
I suggest we allow everyone who wants to put their head in a meat-slicer
to do so, and then simply walk over to the machine which is supposedly
housing their consciousness and flick the power switch... :)
End of
On 6/28/07, Alan Grimes [EMAIL PROTECTED] wrote:
I'm sick and tired of all this endless debate on identity with regards
to uploading.
-Jey Kottalam
--- Stathis Papaioannou [EMAIL PROTECTED] wrote:
On 28/06/07, Matt Mahoney [EMAIL PROTECTED] wrote:
So how do we approach the question of uploading without leading to a
contradiction? I suggest we approach it in the context of outside
observers simulating competing agents. How will
On Jun 28, 2007, at 7:51 PM, Matt Mahoney wrote:
--- Stathis Papaioannou [EMAIL PROTECTED] wrote:
How does this answer questions like, if I am destructively teleported
to two different locations, what can I expect to experience? That's
what I want to know before I press the button.
You have
On 29/06/07, Charles D Hixson [EMAIL PROTECTED] wrote:
Yes, you would live on in one of the copies as if uploaded, and yes
the selection of which copy would be purely random, dependent on the
relative frequency of each copy (you can still define a measure to
derive probabilities even though
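The "measure over copies" idea quoted above can be sketched in a few lines (the copy counts and substrate names below are made-up illustrative choices, not from the thread): the measure is just the relative frequency of each kind of copy, and sampling from it gives the subjective probability of waking up as any one of them.

```python
# Minimal sketch, assuming a measure defined as relative frequency:
# with 3 copies running on substrate "A" and 1 on substrate "B",
# the measure assigns subjective probability 3/4 to waking as an A-copy.
from collections import Counter
import random

copies = ["A", "A", "A", "B"]          # each entry is one running copy

# Normalised measure: relative frequency of each kind of copy.
counts = Counter(copies)
measure = {k: v / len(copies) for k, v in counts.items()}
print(measure)                          # {'A': 0.75, 'B': 0.25}

# Sampling "which copy wakes up as you" according to that measure.
trials = 100_000
hits = sum(random.choice(copies) == "A" for _ in range(trials))
print(round(hits / trials, 2))          # ≈ 0.75
```

Nothing here resolves the identity question; it only shows that well-defined probabilities fall out of the relative-frequency measure even when every copy wakes with certainty.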
On 29/06/07, Niels-Jeroen Vandamme
Personally, I do not believe in coincidence. Everything in the universe
might seem stochastic, but it all has a logical explanation. I believe the
same applies to quantum chaos, though quantum mechanics is still far too
recondite for us to understand this
--- Randall Randall [EMAIL PROTECTED]
wrote:
On Jun 28, 2007, at 7:35 PM, Tom McCabe wrote:
You're assuming again that consciousness is conserved.
I have no idea why you think so. I would say that
I think that each copy is conscious only of their
own particular existence, and if that's
--- Stathis Papaioannou [EMAIL PROTECTED] wrote:
On 29/06/07, Charles D Hixson
[EMAIL PROTECTED] wrote:
Yes, you would live on in one of the copies as if uploaded, and yes
the selection of which copy would be purely random, dependent on the
relative frequency of each copy (you can
On 6/28/07, Alan Grimes [EMAIL PROTECTED] wrote:
I want to find people who are as enthusiastic as I am about such a
future, people who are willing to spend hours each day trying to learn
the skills required to develop a superhuman AI instead of wasting all
that mental energy on the question of
On 6/28/07, Alan Grimes [EMAIL PROTECTED] wrote:
(...)
Seriously now, why do people insist there is a necessary connection (as
in A implies B) between the singularity and brain uploading?
Why is it that anyone who thinks the singularity happens and most
people remain humanoid is automatically
--- Alan Grimes [EMAIL PROTECTED] wrote:
;)
Seriously now, why do people insist there is a necessary connection (as
in A implies B) between the singularity and brain uploading?
Why is it that anyone who thinks the singularity happens and most
people remain humanoid is automatically
--- Stathis Papaioannou [EMAIL PROTECTED] wrote:
On 29/06/07, Niels-Jeroen Vandamme
Personally, I do not believe in coincidence. Everything in the universe
might seem stochastic, but it all has a logical explanation. I believe the
same applies to quantum chaos, though quantum
--- Stathis Papaioannou [EMAIL PROTECTED] wrote:
On 29/06/07, Tom McCabe [EMAIL PROTECTED]
wrote:
I think it works better to look at it from the perspective of
the guy doing the upload rather than the guy being uploaded.
If you magically inserted yourself into the brain of a copy
If you're interested in talking about AGI, then talk about AGI (and
preferably on the agi list)! I'm looking forward to getting my copy
of Conceptual Spaces: The Geometry of Thought by Peter Gaerdenfors
tomorrow, but I don't think I'll have the time to read it soon... I'm
too busy trying to
On 28/06/07, Matt Mahoney [EMAIL PROTECTED] wrote:
When logic conflicts with instinct, instinct wins and the logic gets
contorted. The heated discussion on the copy paradox is a perfect example.
Your consciousness is transferred to the copy only if the original is
destroyed, or destroyed in
--- Randall Randall [EMAIL PROTECTED]
wrote:
On Jun 28, 2007, at 9:08 PM, Tom McCabe wrote:
--- Randall Randall [EMAIL PROTECTED]
wrote:
On Jun 28, 2007, at 7:35 PM, Tom McCabe wrote:
You're assuming again that consciousness is conserved.
I have no idea why you think so. I would say