Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Niels-Jeroen Vandamme
A thermostat perceives the temperature and acts on it. Is it conscious? Registering does not equal perceiving. I mean subjective experience. We think we know what consciousness is. Actually, I'm quite aware that I don't. I find consciousness to be the greatest puzzle in the universe, but

Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Stathis Papaioannou
On 28/06/07, Matt Mahoney [EMAIL PROTECTED] wrote: When logic conflicts with instinct, instinct wins and the logic gets contorted. The heated discussion on the copy paradox is a perfect example. Your consciousness is transferred to the copy only if the original is destroyed, or destroyed in

Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Randall Randall
On Jun 28, 2007, at 12:56 PM, Charles D Hixson wrote: Stathis Papaioannou wrote: Yes, you would live on in one of the copies as if uploaded, and yes the selection of which copy would be purely random, dependent on the relative frequency of each copy (you can still define a measure to derive

Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Tom McCabe
--- Niels-Jeroen Vandamme [EMAIL PROTECTED] wrote: From: Charles D Hixson [EMAIL PROTECTED] Reply-To: singularity@v2.listbox.com To: singularity@v2.listbox.com Subject: Re: [singularity] critiques of Eliezer's views on AI Date: Thu, 28 Jun 2007 09:56:12 -0700 Stathis Papaioannou wrote:

Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Tom McCabe
How do you get the 50% chance? There is a 100% chance of a mind waking up who has been uploaded, and also a 100% chance of a mind waking up who hasn't. This doesn't violate the laws of probability because these aren't mutually exclusive. Asking which one was you is silly, because we're assuming
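The probability point here can be made concrete with a toy simulation (a hypothetical sketch, not anything from the thread): if every trial of the destructive-copy experiment produces two successors who both wake up, then each waking event has probability 1, and the two probabilities sum to more than 1 without contradiction, precisely because the events are not mutually exclusive.

```python
def run_trial():
    """One trial of the copy thought experiment: after the procedure
    there are two successors, and both wake up every time."""
    uploaded_wakes = True
    original_wakes = True
    return uploaded_wakes, original_wakes

trials = [run_trial() for _ in range(10_000)]
p_uploaded = sum(u for u, _ in trials) / len(trials)
p_original = sum(o for _, o in trials) / len(trials)

print(p_uploaded)  # 1.0
print(p_original)  # 1.0
# p_uploaded + p_original = 2.0 > 1, which violates nothing: the axiom
# P(A or B) = P(A) + P(B) requires A and B to be mutually exclusive,
# and "the upload wakes" / "the original wakes" are not.
```

The 50% figure only appears if one smuggles in an extra assumption that exactly one successor "is really you", which is the very point under dispute in the thread.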

Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Randall Randall
On Jun 28, 2007, at 5:18 PM, Tom McCabe wrote: How do you get the 50% chance? There is a 100% chance of a mind waking up who has been uploaded, and also a 100% chance of a mind waking up who hasn't. This doesn't violate the laws of probability because these aren't mutually exclusive. Asking

Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Tom McCabe
--- Niels-Jeroen Vandamme [EMAIL PROTECTED] wrote: This is a textbook case of what Eliezer calls worshipping a sacred mystery. People tend to act like a theoretical problem is some kind of God, something above them in the social order, and since it's beaten others before you it would be

Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Tom McCabe
--- Randall Randall [EMAIL PROTECTED] wrote: On Jun 28, 2007, at 5:18 PM, Tom McCabe wrote: How do you get the 50% chance? There is a 100% chance of a mind waking up who has been uploaded, and also a 100% chance of a mind waking up who hasn't. This doesn't violate the laws of

Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Randall Randall
On Jun 28, 2007, at 5:59 PM, Tom McCabe wrote: --- Randall Randall [EMAIL PROTECTED] wrote: On Jun 28, 2007, at 5:18 PM, Tom McCabe wrote: How do you get the 50% chance? There is a 100% chance of a mind waking up who has been uploaded, and also a 100% chance of a mind waking up who hasn't.

[singularity] A modest proposal.

2007-06-28 Thread Alan Grimes
I'm sick and tired of all this endless debate on identity with regards to uploading. I suggest we allow everyone who wants to put their head in a meat-slicer do so and then simply walk over to the machine which is supposedly housing their consciousness, and flick the power switch... :) End of

Re: [singularity] A modest proposal.

2007-06-28 Thread Jey Kottalam
On 6/28/07, Alan Grimes [EMAIL PROTECTED] wrote: I'm sick and tired of all this endless debate on identity with regards to uploading. -Jey Kottalam

Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Matt Mahoney
--- Stathis Papaioannou [EMAIL PROTECTED] wrote: On 28/06/07, Matt Mahoney [EMAIL PROTECTED] wrote: So how do we approach the question of uploading without leading to a contradiction? I suggest we approach it in the context of outside observers simulating competing agents. How will

Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Randall Randall
On Jun 28, 2007, at 7:51 PM, Matt Mahoney wrote: --- Stathis Papaioannou [EMAIL PROTECTED] wrote: How does this answer questions like, if I am destructively teleported to two different locations, what can I expect to experience? That's what I want to know before I press the button. You have

Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Stathis Papaioannou
On 29/06/07, Charles D Hixson [EMAIL PROTECTED] wrote: Yes, you would live on in one of the copies as if uploaded, and yes the selection of which copy would be purely random, dependent on the relative frequency of each copy (you can still define a measure to derive probabilities even though

Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Stathis Papaioannou
On 29/06/07, Niels-Jeroen Vandamme wrote: Personally, I do not believe in coincidence. Everything in the universe might seem stochastic, but it all has a logical explanation. I believe the same applies to quantum chaos, though quantum mechanics is still far too recondite for us to understand this

Re: Magickal consciousness stuff was Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Tom McCabe
--- Randall Randall [EMAIL PROTECTED] wrote: On Jun 28, 2007, at 7:35 PM, Tom McCabe wrote: You're assuming again that consciousness is conserved. I have no idea why you think so. I would say that I think that each copy is conscious only of their own particular existence, and if that's

Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Tom McCabe
--- Stathis Papaioannou [EMAIL PROTECTED] wrote: On 29/06/07, Charles D Hixson [EMAIL PROTECTED] wrote: Yes, you would live on in one of the copies as if uploaded, and yes the selection of which copy would be purely random, dependent on the relative frequency of each copy (you can

Re: [singularity] Previous message was a big hit, eh?

2007-06-28 Thread Jey Kottalam
On 6/28/07, Alan Grimes [EMAIL PROTECTED] wrote: I want to find people who are as enthusiastic as I am about such a future, people who are willing to spend hours each day trying to learn the skills required to develop a superhuman AI instead of wasting all that mental energy on the question of

Re: [singularity] Previous message was a big hit, eh?

2007-06-28 Thread LĂșcio de Souza Coelho
On 6/28/07, Alan Grimes [EMAIL PROTECTED] wrote: (...) Seriously now, Why do people insist there is a necessary connection (as in A implies B) between the singularity and brain uploading? Why is it that anyone who thinks the singularity happens and most people remain humanoid is automatically

Re: [singularity] Previous message was a big hit, eh?

2007-06-28 Thread Tom McCabe
--- Alan Grimes [EMAIL PROTECTED] wrote: ;) Seriously now, Why do people insist there is a necessary connection (as in A implies B) between the singularity and brain uploading? Why is it that anyone who thinks the singularity happens and most people remain humanoid is automatically

Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Tom McCabe
--- Stathis Papaioannou [EMAIL PROTECTED] wrote: On 29/06/07, Niels-Jeroen Vandamme Personally, I do not believe in coincidence. Everything in the universe might seem stochastic, but it all has a logical explanation. I believe the same applies to quantum chaos, though quantum

Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Tom McCabe
--- Stathis Papaioannou [EMAIL PROTECTED] wrote: On 29/06/07, Tom McCabe [EMAIL PROTECTED] wrote: I think it works better to look at it from the perspective of the guy doing the upload rather than the guy being uploaded. If you magically inserted yourself into the brain of a copy

Re: [singularity] Previous message was a big hit, eh?

2007-06-28 Thread Alan Grimes
If you're interested in talking about AGI, then talk about AGI (and preferably on the agi list)! I'm looking forward to getting my copy of Conceptual Spaces: The Geometry of Thought by Peter Gaerdenfors tomorrow, but I don't think I'll have the time to read it soon... I'm too busy trying to

Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Heartland
On 28/06/07, Matt Mahoney [EMAIL PROTECTED] wrote: When logic conflicts with instinct, instinct wins and the logic gets contorted. The heated discussion on the copy paradox is a perfect example. Your consciousness is transferred to the copy only if the original is destroyed, or destroyed in

Re: Magickal consciousness stuff was Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Tom McCabe
--- Randall Randall [EMAIL PROTECTED] wrote: On Jun 28, 2007, at 9:08 PM, Tom McCabe wrote: --- Randall Randall [EMAIL PROTECTED] wrote: On Jun 28, 2007, at 7:35 PM, Tom McCabe wrote: You're assuming again that consciousness is conserved. I have no idea why you think so. I would say