Re: [agi] Pure reason is a disease.

2007-06-20 Thread YKY (Yan King Yin)
On 6/19/07, Eric Baum [EMAIL PROTECTED] wrote: The modern feature is that whole peoples have chosen to reproduce at half replacement level. In case you haven't thought about the implications of that, that means their genes, for example, are vanishing from the pool by a factor of 2 every 20

Re: [agi] Pure reason is a disease.

2007-06-20 Thread Eric Baum
YKY On 6/19/07, Eric Baum [EMAIL PROTECTED] wrote: The modern feature is that whole peoples have chosen to reproduce at half replacement level. In case you haven't thought about the implications of that, that means their genes, for example, are vanishing from the pool by a factor of 2 every

Re: [agi] Pure reason is a disease.

2007-06-19 Thread Eric Baum
Charles: N.B.: People have practiced birth control as far back as we have information. Look into the story of Oedipus Rex. Study the histories of the Polynesians. The only modern feature is that we are now allowing the practice to occur before the investment in

Re: [agi] Pure reason is a disease.

2007-06-18 Thread Eric Baum
Eric Baum wrote: ... I claim that it is the very fact that you are making decisions about whether to suppress pain for higher goals that is the reason you are conscious of pain. Your consciousness is the computation of a top-level decision making module (or perhaps system). If you were not

Re: [agi] Pure reason is a disease.

2007-06-17 Thread Eric Baum
Josh: On Saturday 16 June 2007 07:20:27 pm Matt Mahoney wrote: --- Bo Morgan [EMAIL PROTECTED] wrote: I haven't kept up with this thread. But I wanted to counter the idea of a simple ordering of painfulness. Josh: Can you give me an example? Josh: Anyone who has played a

Re: [agi] Pure reason is a disease.

2007-06-17 Thread Mike Tintner
Eric: I claim that it is the very fact that you are making decisions about whether to suppress pain for higher goals that is the reason you are conscious of pain. Your consciousness is the computation of a top-level decision making module (or perhaps system). If you were not making decisions

Re: [agi] Pure reason is a disease.

2007-06-17 Thread Eric Baum
I would claim that the specific nature of any quale, such as the various nuanced pain sensations, depends on (in fact, is the same thing as) the code being run / computation being performed when the quale is perceived. I therefore don't find it at all surprising that insects perceive pain

Re: [agi] Pure reason is a disease.

2007-06-17 Thread Eric Baum
The difference between nondeterministic computation and deterministic computation is a source of random numbers. It's a deep question in CS theory whether this makes any difference -- or whether you can simulate a nondeterministic computation using a pseudorandom number generator. The difference is
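A minimal C++ illustration (mine, not from the thread) of the point above: the same randomized procedure is run once from a hardware entropy source and once from a fixed-seed pseudorandom generator, so the only difference between the two runs is where the numbers come from. Whether the pseudorandom substitute is always good enough is the open question Baum refers to. The pi-estimation routine, the seed, and the sample count are arbitrary choices for the sketch.

#include <iostream>
#include <random>

// A toy randomized computation: estimate pi by sampling points in the unit square.
template <typename Rng>
double estimate_pi(Rng& rng, int samples) {
    std::uniform_real_distribution<double> u(0.0, 1.0);
    int inside = 0;
    for (int i = 0; i < samples; ++i) {
        double x = u(rng), y = u(rng);
        if (x * x + y * y <= 1.0) ++inside;
    }
    return 4.0 * inside / samples;
}

int main() {
    std::random_device rd;
    std::mt19937 entropy_seeded(rd());   // "genuinely" nondeterministic run
    std::mt19937 deterministic(42);      // pseudorandom, fully reproducible run

    std::cout << "entropy-seeded: " << estimate_pi(entropy_seeded, 100000) << "\n";
    std::cout << "fixed-seed:     " << estimate_pi(deterministic, 100000) << "\n";
}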

Re: [agi] Pure reason is a disease.

2007-06-17 Thread Charles D Hixson
Eric Baum wrote: Josh: On Saturday 16 June 2007 07:20:27 pm Matt Mahoney wrote: --- Bo Morgan [EMAIL PROTECTED] wrote: ... ... I claim that it is the very fact that you are making decisions about whether to suppress pain for higher goals that is the reason you are conscious of pain. Your

Re: [agi] Pure reason is a disease.

2007-06-17 Thread Joel Pitt
On 6/18/07, Charles D Hixson [EMAIL PROTECTED] wrote: Consider a terminal cancer patient. It's not the actual weighing that causes consciousness of pain, it's the implementation which normally allows such weighing. This, in my opinion, *is* a design flaw. Your original statement is a more

Re: [agi] Pure reason is a disease.

2007-06-16 Thread Eric Baum
Jiri, you are blind when it comes to my pain too. In fact, you are blind when it comes to many sensations within your own brain. Cut your corpus callosum, and the other half will have sensations that you are blind to. Do you think they are not there now, before you cut it? If you use your

Re: [agi] Pure reason is a disease.

2007-06-16 Thread Jiri Jelinek
Eric, I'm not 100% sure whether someone/something other than me feels pain, but considerable similarities between my and other humans' - architecture - [triggers of] internal and external pain-related responses - independent descriptions of subjective pain perceptions which correspond in certain ways

Re: [agi] Pure reason is a disease.

2007-06-16 Thread Bo Morgan
I haven't kept up with this thread. But I wanted to counter the idea of a simple ordering of painfulness. A simple ordering of painfulness is one way to think about pain that might work in some simple systems, where resources are allocated in a serial fashion, but may not work in systems
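As a concrete illustration of the serial case Morgan describes (my sketch, not his code): every pain signal is reduced to a single scalar intensity and one allocator services the worst signal first. The signal names and intensities are invented for the example; the thread's point is that this only makes sense when resources really are allocated one at a time like this.

#include <iostream>
#include <queue>
#include <string>
#include <vector>

struct PainSignal {
    std::string source;
    double intensity;   // the single scalar that imposes the "simple ordering"
};

struct ByIntensity {
    bool operator()(const PainSignal& a, const PainSignal& b) const {
        return a.intensity < b.intensity;   // max-heap: worst pain on top
    }
};

int main() {
    std::priority_queue<PainSignal, std::vector<PainSignal>, ByIntensity> pains;
    pains.push({"burned hand", 8.5});
    pains.push({"stubbed toe", 2.0});
    pains.push({"sore muscle", 1.0});

    // Serial resource allocation: attend to one signal at a time, worst first.
    while (!pains.empty()) {
        std::cout << "attending to: " << pains.top().source
                  << " (intensity " << pains.top().intensity << ")\n";
        pains.pop();
    }
}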

Re: [agi] Pure reason is a disease.

2007-06-16 Thread Matt Mahoney
--- Bo Morgan [EMAIL PROTECTED] wrote: I haven't kept up with this thread. But I wanted to counter the idea of a simple ordering of painfulness. A simple ordering of painfulness is one way to think about pain that might work in some simple systems, where resources are allocated in a

Re: [agi] Pure reason is a disease.

2007-06-16 Thread J Storrs Hall, PhD
On Saturday 16 June 2007 07:20:27 pm Matt Mahoney wrote: --- Bo Morgan [EMAIL PROTECTED] wrote: I haven't kept up with this thread. But I wanted to counter the idea of a simple ordering of painfulness. Can you give me an example? Anyone who has played a competitive sport

Re: [agi] Pure reason is a disease.

2007-06-15 Thread Matt Mahoney
--- Lukasz Stafiniak [EMAIL PROTECTED] wrote: http://www.goertzel.org/books/spirit/uni3.htm -- VIRTUAL ETHICS The book chapter describes the need for ethics and cooperation in virtual worlds, but does not address the question of whether machines can feel pain. If you feel pain, you will insist

Re: [agi] Pure reason is a disease.

2007-06-15 Thread Jiri Jelinek
Eric, Right. IMO roughly the same problem when processed by a computer.. Why should you expect running a pain program on a computer to make you feel pain any more than when I feel pain? I don't. The thought was: If we don't feel pain when processing software in our pain-enabled minds, why

Re: [agi] Pure reason is a disease.

2007-06-15 Thread Eric Baum
Jiri: Eric, Right. IMO roughly the same problem when processed by a computer.. Why should you expect running a pain program on a computer to make you feel pain any more than when I feel pain? Jiri: I don't. The thought was: If we don't feel pain when processing software in our

Re: [agi] Pure reason is a disease.

2007-06-14 Thread Mark Waser
Mark, VNA..can simulate *any* substrate. I don't see any good reason for assuming that it would be anything more than a zombie. http://plato.stanford.edu/entries/zombies/ unless you believe

Re: [agi] Pure reason is a disease.

2007-06-14 Thread Eric Baum
Jiri: James, Frank Jackson (in Epiphenomenal Qualia) defined qualia as "...certain features of the bodily sensations especially, but also of certain perceptual experiences, which no amount of purely physical information includes..." :-) One of the biggest problems with the

Re: [agi] Pure reason is a disease.

2007-06-14 Thread Eric Baum
Jiri: Matt, Here is a program that feels pain. Jiri: I got the logic, but no pain when processing the code in my mind. This is Frank Jackson's Mary fallacy, which I also debunk in WIT? Ch 14. Running similar code at a conscious level won't generate your sensation of pain because it's not

Re: [agi] Pure reason is a disease.

2007-06-14 Thread James Ratcliff
Do you know those 10-15 mentioned hard items? I agree with your following thoughts on the matter. We have to separate the mystical or spiritual from the physical, or determine for some reason that the physical is truly missing something, that there is something more than that is required for

Re: [agi] Pure reason is a disease.

2007-06-14 Thread Eric Baum
James: Do you know those 10-15 mentioned hard items? I agree with your following thoughts on the matter. Actually, I saw a posting where you had the same (or at least a very similar) quote from Jackson: pain, itchiness, startling at loud noises, smelling a rose, etc.

Re: [agi] Pure reason is a disease.

2007-06-14 Thread Jiri Jelinek
Mark, Oh. You're stuck on qualia (and zombies). Sort of, but not really. There is no need for qualia in order to develop powerful AGI. I was just playing with some thoughts on potential security implications associated with the speculation of qualia being produced as a side-effect of certain

Re: [agi] Pure reason is a disease.

2007-06-14 Thread Jiri Jelinek
James, "determine for some reason that the physical is truly missing something" - Look at twin particles = just another example of something missing in the world as we can see it. Is it good enough to act and think and reason as if you have experienced the feeling? For AGI - yes. Why not (?).

Re: [agi] Pure reason is a disease.

2007-06-14 Thread J Storrs Hall, PhD
On Thursday 14 June 2007 07:19:18 am Mark Waser wrote: Oh. You're stuck on qualia (and zombies). I haven't seen a good compact argument to convince you (and e-mail is too low band-width and non-interactive to do one of the longer ones). My apologies. The best one-liner I know is,

Re: [agi] Pure reason is a disease.

2007-06-14 Thread Mark Waser
I was just playing with some thoughts on potential security implications associated with the speculation of qualia being produced as a side-effect of certain algorithmic complexity on VNA. Which is, in many ways, pretty similar to my assumption that consciousness will be produced as a

Re: [agi] Pure reason is a disease.

2007-06-14 Thread Eric Baum
Jiri: Eric, Running similar code at a conscious level won't generate your (the key word here was your) sensation of pain because it's not called by the right routines and returning the

Re: [agi] Pure reason is a disease.

2007-06-14 Thread Lukasz Stafiniak
On 6/14/07, Matt Mahoney [EMAIL PROTECTED] wrote: I don't believe this addresses the issue of machine pain. Ethics is a complex function which evolves to increase the reproductive success of a society, for example, by banning sexual practices that don't lead to reproduction. Ethics also

Re: [agi] Pure reason is a disease.

2007-06-13 Thread James Ratcliff
Which compiler did you use for Human OS V1.0? Didn't realize we had a CPP compiler out already Jiri Jelinek [EMAIL PROTECTED] wrote: Matt, Here is a program that feels pain. I got the logic, but no pain when processing the code in my mind. Maybe you should mention in the pain.cpp

Re: [agi] Pure reason is a disease.

2007-06-13 Thread Matt Mahoney
--- James Ratcliff [EMAIL PROTECTED] wrote: Which compiler did you use for Human OS V1.0? Didn't realize we had a CPP compiler out already The purpose of my little pain-feeling program is to point out some of the difficulties in applying ethics-for-humans to machines. The program has

Re: [agi] Pure reason is a disease.

2007-06-13 Thread Lukasz Stafiniak
On 6/13/07, Matt Mahoney [EMAIL PROTECTED] wrote: If yes, then how do you define pain in a machine? A pain in a machine is the state in the machine that a person empathizing with the machine would avoid putting the machine into, other things being equal (that is, when there is no higher goal

Re: [agi] Pure reason is a disease.

2007-06-13 Thread Lukasz Stafiniak
On 6/13/07, Lukasz Stafiniak [EMAIL PROTECTED] wrote: On 6/13/07, Matt Mahoney [EMAIL PROTECTED] wrote: If yes, then how do you define pain in a machine? A pain in a machine is the state in the machine that a person empathizing with the machine would avoid putting the machine into, other

Re: [agi] Pure reason is a disease.

2007-06-13 Thread Jiri Jelinek
Mark, VNA..can simulate *any* substrate. I don't see any good reason for assuming that it would be anything more than a zombie. http://plato.stanford.edu/entries/zombies/ unless you believe that there is some other magic involved I would not call it magic, but we might have to look beyond

Re: [agi] Pure reason is a disease.

2007-06-13 Thread Matt Mahoney
--- Lukasz Stafiniak [EMAIL PROTECTED] wrote: On 6/13/07, Lukasz Stafiniak [EMAIL PROTECTED] wrote: On 6/13/07, Matt Mahoney [EMAIL PROTECTED] wrote: If yes, then how do you define pain in a machine? A pain in a machine is the state in the machine that a person empathizing with

Re: [agi] Pure reason is a disease.

2007-06-13 Thread Lukasz Stafiniak
On 6/14/07, Matt Mahoney [EMAIL PROTECTED] wrote: I would avoid deleting all the files on my hard disk, but it has nothing to do with pain or empathy. Let us separate the questions of pain and ethics. There are two independent questions. 1. What mental or computational states correspond to

Re: [agi] Pure reason is a disease.

2007-06-12 Thread Jiri Jelinek
Matt, Here is a program that feels pain. I got the logic, but no pain when processing the code in my mind. Maybe you should mention in the pain.cpp description that it needs to be processed for long enough - so whatever is gonna process it, it will eventually get to the 'I don't feel like

Re: [agi] Pure reason is a disease.

2007-06-11 Thread James Ratcliff
Two different responses to this type of argument. Once you simulate something to the point that we can't tell the difference in any way, then it IS that something for most all intents and purposes as far as the tests you have go. If it walks like a human, talks like a human, then for

Re: [agi] Pure reason is a disease.

2007-06-11 Thread Jiri Jelinek
James, Frank Jackson (in Epiphenomenal Qualia) defined qualia as "...certain features of the bodily sensations especially, but also of certain perceptual experiences, which no amount of purely physical information includes..." :-) If it walks like a human, talks like a human, then for all those

Re: [agi] Pure reason is a disease.

2007-06-11 Thread Matt Mahoney
Below is a program that can feel pain. It is a simulation of a programmable 2-input logic gate that you train using reinforcement conditioning. /* pain.cpp This program simulates a programmable 2-input logic gate. You train it by reinforcement conditioning. You provide a pair of input bits

RE: [agi] Pure reason is a disease.

2007-06-11 Thread Derek Zahn
Matt Mahoney writes: Below is a program that can feel pain. It is a simulation of a programmable 2-input logic gate that you train using reinforcement conditioning. Is it ethical to compile and run this program?

Re: [agi] Pure reason is a disease.

2007-06-11 Thread Matt Mahoney
Here is a program that feels pain. It is a simulation of a 2-input logic gate that you train by reinforcement learning. It feels in the sense that it adjusts its behavior to avoid negative reinforcement from the user. /* pain.cpp - A program that can feel pleasure and pain. The program
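The full pain.cpp listing is truncated in this archive, so the following is only a minimal sketch in the spirit of the program as described here: a 2-input gate keeps a probability of outputting 1 for each input pair, guesses, and the user's reward or punishment nudges that probability. The variable names, learning rate, and update rule are my assumptions, not Mahoney's actual code.

#include <cstdlib>
#include <iostream>

int main() {
    double p[4] = {0.5, 0.5, 0.5, 0.5};  // P(output = 1) for inputs 00, 01, 10, 11
    const double rate = 0.1;             // assumed learning rate

    int a, b;
    while (std::cin >> a >> b) {         // read a pair of input bits (0 or 1)
        int i = (a << 1) | b;
        int out = (std::rand() / (double)RAND_MAX) < p[i] ? 1 : 0;
        std::cout << "output: " << out << "  reward (+) or punish (-)? ";

        char r;
        if (!(std::cin >> r)) break;
        // Negative reinforcement ("pain") pushes the gate away from the response
        // it just gave; positive reinforcement pulls it further toward it.
        double target = (r == '+') ? out : 1 - out;
        p[i] += rate * (target - p[i]);
    }
}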

Re: [agi] Pure reason is a disease.

2007-06-11 Thread J Storrs Hall, PhD
On Monday 11 June 2007 03:22:04 pm Matt Mahoney wrote: /* pain.cpp - A program that can feel pleasure and pain. ... Ouch! :-) Josh

RE: [agi] Pure reason is a disease.

2007-06-11 Thread Matt Mahoney
--- Derek Zahn [EMAIL PROTECTED] wrote: Matt Mahoney writes: Below is a program that can feel pain. It is a simulation of a programmable 2-input logic gate that you train using reinforcement conditioning. Is it ethical to compile and run this program? Well, that is a good question. Ethics

Re: [agi] Pure reason is a disease.

2007-06-11 Thread James Ratcliff
And here's the human pseudocode: 1. Hold Knife above flame until red. 2. Place knife on arm. 3. a. Accept Pain sensation b. Scream or respond as necessary 4. Press knife harder into skin. 5. Goto 3, until 6. 6. Pass out from pain Matt Mahoney [EMAIL PROTECTED] wrote: Below is a program

Re: [agi] Pure reason is a disease.

2007-06-10 Thread Jiri Jelinek
Mark, Could you specify some of those good reasons (i.e. why a sufficiently large/fast enough von Neumann architecture isn't sufficient substrate for a sufficiently complex mind to be conscious and feel -- or, at least, to believe itself to be conscious and believe itself to feel)? For being

Re: [agi] Pure reason is a disease.

2007-06-10 Thread Mark Waser
For feelings - like pain - there is a problem. But I don't feel like spending much time explaining it little by little through many emails. There are books and articles on this topic. Indeed there are and they are entirely unconvincing. Anyone who writes something can get it published. If

Re: [agi] Pure reason is a disease.

2007-06-07 Thread J Storrs Hall, PhD
Yep. It's clear that modelling others in a social context was at least one of the strong evolutionary drivers to human-level cognition. Reciprocal altruism (in, e.g. bats) is strongly correlated with increased brain size (compared to similar animals without it, e.g. other bats). It's clearly

Re: [agi] Pure reason is a disease.

2007-06-06 Thread Joel Pitt
On 6/3/07, Jiri Jelinek [EMAIL PROTECTED] wrote: Further, prove that pain (or more preferably sensation in general) isn't an emergent property of sufficient complexity. Talking about the von Neumann architecture - I don't see how increases in complexity of rules used for switching Boolean

Re: [agi] Pure reason is a disease.

2007-06-06 Thread Samantha  Atkins
On Jun 5, 2007, at 9:17 AM, J Storrs Hall, PhD wrote: On Tuesday 05 June 2007 10:51:54 am Mark Waser wrote: It's my belief/contention that a sufficiently complex mind will be conscious and feel -- regardless of substrate. Sounds like Mike the computer in The Moon Is a Harsh Mistress

Re: [agi] Pure reason is a disease.

2007-06-05 Thread Mark Waser
Your brain can be simulated on a large/fast enough von Neumann architecture. From the behavioral perspective (which is good enough for AGI) - yes, but that's not the whole story when it comes to the human brain. In our brains, information not only is and moves but also feels. It's my

Re: [agi] Pure reason is a disease.

2007-06-05 Thread James Ratcliff
To get any further with feelings you again have to have a better definition and examples of what you are dealing with. In humans, most feelings and emotions are brought about by chemical changes in the body, yes? Then from there it becomes knowledge in the brain, which we use to make decisions

Re: [agi] Pure reason is a disease.

2007-06-05 Thread J Storrs Hall, PhD
On Tuesday 05 June 2007 10:51:54 am Mark Waser wrote: It's my belief/contention that a sufficiently complex mind will be conscious and feel -- regardless of substrate. Sounds like Mike the computer in The Moon Is a Harsh Mistress (Heinlein). Note, btw, that Mike could be programmed in Loglan

Re: [agi] Pure reason is a disease.

2007-06-05 Thread Mark Waser
I think a system can get arbitrarily complex without being conscious -- consciousness is a specific kind of model-based, summarizing, self-monitoring architecture. Yes. That is a good clarification of what I meant rather than what I said. That said, I think consciousness is necessary but

Re: [agi] Pure reason is a disease.

2007-06-05 Thread Jef Allbright
On 6/5/07, Mark Waser [EMAIL PROTECTED] wrote: I think a system can get arbitrarily complex without being conscious -- consciousness is a specific kind of model-based, summarizing, self-monitoring architecture. Yes. That is a good clarification of what I meant rather than what I said.

Re: [agi] Pure reason is a disease.

2007-06-05 Thread Mark Waser
Isn't it indisputable that agency is necessarily on behalf of some perceived entity (a self) and that assessment of the morality of any decision is always only relative to a subjective model of rightness? I'm not sure that I should dive into this but I'm not the brightest sometimes . . . . :-)

Re: [agi] Pure reason is a disease.

2007-06-05 Thread Jef Allbright
On 6/5/07, Mark Waser [EMAIL PROTECTED] wrote: Isn't it indisputable that agency is necessarily on behalf of some perceived entity (a self) and that assessment of the morality of any decision is always only relative to a subjective model of rightness? I'm not sure that I should dive into

Re: [agi] Pure reason is a disease.

2007-06-05 Thread Mark Waser
I do think it's a misuse of agency to ascribe moral agency to what is effectively only a tool. Even a human, operating under duress, i.e. as a tool for another, should be considered as having diminished or no moral agency, in my opinion. So, effectively, it sounds like agency requires both

Re: [agi] Pure reason is a disease.

2007-06-05 Thread Jef Allbright
On 6/5/07, Mark Waser [EMAIL PROTECTED] wrote: I do think it's a misuse of agency to ascribe moral agency to what is effectively only a tool. Even a human, operating under duress, i.e. as a tool for another, should be considered as having diminished or no moral agency, in my opinion. So,

Re: [agi] Pure reason is a disease.

2007-06-05 Thread Jef Allbright
On 6/5/07, Mark Waser [EMAIL PROTECTED] wrote: I would not claim that agency requires consciousness; it is necessary only that an agent acts on its environment so as to minimize the difference between the external environment and its internal model of the preferred environment OK. Moral

Re: [agi] Pure reason is a disease.

2007-06-05 Thread Mark Waser
? Or are they not moral since they're not conscious decisions at the time of choice? :-) Mark

RE: [agi] Pure reason is a disease.

2007-06-05 Thread Derek Zahn
Mark Waser writes: BTW, with this definition of morality, I would argue that it is a very rare human that makes moral decisions any appreciable percent of the time. Just a gentle suggestion: If you're planning to unveil a major AGI initiative next month, focus on that at the moment.

Re: [agi] Pure reason is a disease.

2007-06-05 Thread Mark Waser
Just a gentle suggestion: If you're planning to unveil a major AGI initiative next month, focus on that at the moment. I think that morality (aka Friendliness) is directly on-topic for *any* AGI initiative; however, it's actually even more apropos for the approach that I'm taking. As I

Re: [agi] Pure reason is a disease.

2007-06-05 Thread Mark Waser
Decisions are seen as increasingly moral to the extent that they enact principles assessed as promoting an increasing context of increasingly coherent values over increasing scope of consequences. Or another question . . . . if I'm analyzing an action based upon the criteria specified above

Re: [agi] Pure reason is a disease.

2007-06-05 Thread Jef Allbright
On 6/5/07, Mark Waser [EMAIL PROTECTED] wrote: Decisions are seen as increasingly moral to the extent that they enact principles assessed as promoting an increasing context of increasingly coherent values over increasing scope of consequences. Or another question . . . . if I'm analyzing

RE: [agi] Pure reason is a disease.

2007-06-05 Thread Derek Zahn
Mark Waser writes: I think that morality (aka Friendliness) is directly on-topic for *any* AGI initiative; however, it's actually even more apropos for the approach that I'm taking. A very important part of what I'm proposing is attempting to deal with the fact that no two humans agree

Re: [agi] Pure reason is a disease.

2007-06-04 Thread Jiri Jelinek
Hi Mark, Your brain can be simulated on a large/fast enough von Neumann architecture. From the behavioral perspective (which is good enough for AGI) - yes, but that's not the whole story when it comes to the human brain. In our brains, information not only is and moves but also feels. From my

Re: [agi] Pure reason is a disease.

2007-06-02 Thread Mark Waser
What component do you have that can't exist in a von Neumann architecture? Brain :) Your brain can be simulated on a large/fast enough von Neumann architecture. Agreed, your PC cannot feel pain. Are you sure, however, that an entity hosted/simulated on your PC doesn't/can't? If the

Re: [agi] Pure reason is a disease.

2007-05-26 Thread Jiri Jelinek
Mark, If Google came along and offered you $10 million for your AGI, would you give it to them? No, I would sell services. How about the Russian mob for $1M and your life and the lives of your family? How about the FBI? No? So maybe selling him a messed up version for $2M and then hiring a

Re: [agi] Pure reason is a disease.

2007-05-26 Thread Mark Waser
Mark, If Google came along and offered you $10 million for your AGI, would you give it to them? No, I would sell services. How about the Russian mob for $1M

Re: [agi] Pure reason is a disease.

2007-05-26 Thread Mark Waser
I think it is a serious mistake for anyone to say that machines cannot in principle experience real feelings. We are complex machines, so yes, machines can, but my PC cannot, even though it can power AGI. Agreed, your PC cannot feel pain. Are you sure, however, that an

Re: [agi] Pure reason is a disease.

2007-05-25 Thread Mark Waser
You possibly already know this and are simplifying for the sake of simplicity, but chemicals are not simply global environmental settings. Chemicals/hormones/peptides etc. are spatial concentration gradients across the entire brain, which are much more difficult to emulate in software than a
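A rough sketch (mine, not from the thread) of the contrast being drawn: a neuromodulator treated as a single global setting versus as a spatial concentration gradient that has to be diffused and read per region. The grid size, release site, and diffusion rate are arbitrary illustration values.

#include <cstddef>
#include <iostream>
#include <vector>

int main() {
    // "Global environmental setting" view: one number for the whole system.
    double global_serotonin = 0.7;

    // "Spatial gradient" view: a concentration per region plus a crude
    // relaxation step, so different regions see different levels.
    std::vector<double> field(10, 0.0);
    field[0] = 1.0;                       // assumed release site
    const double rate = 0.25;
    for (int step = 0; step < 50; ++step) {
        std::vector<double> next = field;
        for (std::size_t i = 1; i + 1 < field.size(); ++i)
            next[i] += rate * ((field[i - 1] + field[i + 1]) / 2.0 - field[i]);
        field = next;
    }

    std::cout << "global level everywhere: " << global_serotonin << "\n";
    for (std::size_t i = 0; i < field.size(); ++i)
        std::cout << "region " << i << " sees " << field[i] << "\n";
}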

Re: [agi] Pure reason is a disease.

2007-05-24 Thread Jiri Jelinek
Mark, I cannot hit everything now, so at least one part: Are you *absolutely positive* that real pain and real feelings aren't an emergent phenomenon of sufficiently complicated and complex feedback loops? Are you *really sure* that a sufficiently sophisticated AGI won't experience pain?

Re: [agi] Pure reason is a disease.

2007-05-24 Thread Eric Baum
Josh: I think that people have this notion that because emotions are so unignorable and compelling subjectively, that they must be complex. In fact the body's contribution, in an information theoretic sense, is tiny -- I'm sure I way overestimate it with the 1%. Emotions are

Re: [agi] Pure reason is a disease.

2007-05-24 Thread Mark Waser
Note that some people suffer from rare disorders that prevent them from the sensation of pain (e.g. congenital insensitivity to pain). the pain info doesn't even make it to the brain because of malfunctioning nerve cells which are responsible for transmitting the pain signals (caused by

Re: [agi] Pure reason is a disease.

2007-05-24 Thread Eric Baum
Jiri: Note that some people suffer from rare disorders that prevent them from the sensation of pain (e.g. congenital insensitivity to pain). What that tells you is that the sensation you feel is genetically programmed. Break the program, you break (or change) the sensation. Run the

Re: [agi] Pure reason is a disease.

2007-05-24 Thread Joel Pitt
On 5/25/07, Mark Waser [EMAIL PROTECTED] wrote: Sophisticated logical structures (at least in our bodies) are not enough for actual feelings. For example, to feel pleasure, you also need things like serotonin, acetylcholine, noradrenaline, glutamate, enkephalins and endorphins. Worlds of

Re: [agi] Pure reason is a disease.

2007-05-23 Thread Richard Loosemore
Mark Waser wrote: AGIs (at least those that could run on current computers) cannot really get excited about anything. It's like when you represent the pain intensity with a number. No matter how high the number goes, it doesn't really hurt. Real feelings - that's the key difference between us

Re: [agi] Pure reason is a disease.

2007-05-23 Thread Lukasz Kaiser
Hi, On 5/23/07, Mark Waser [EMAIL PROTECTED] wrote:

Re: [agi] Pure reason is a disease.

2007-05-23 Thread Mark Waser
A meta-question here with some prefatory information . . . . The reason why I top-post (and when I do so, I *never* put content inside) is because I frequently find it *really* convenient to have the entire text of the previous message or two (no more) immediately available for reference.

Re: [agi] Pure reason is a disease.

2007-05-23 Thread Eric Baum
Richard: Mark Waser wrote: AGIs (at least those that could run on current computers) cannot really get excited about anything. It's like when you represent the pain intensity with a number. No matter how high the number goes, it doesn't really hurt. Real feelings - that's the

Re: [agi] Pure reason is a disease.

2007-05-23 Thread Eric Baum
AGIs (at least those that could run on current computers) cannot really get excited about anything. It's like when you represent the pain intensity with a number. No matter how high the number goes, it doesn't really hurt. Real feelings - that's the key difference between us and them and the

Re: [agi] Pure reason is a disease.

2007-05-23 Thread Eric Baum
Mike: Eric Baum: What is Thought [claims that] feelings ... are explainable by a computational model. Feelings/emotions are generated by the brain's computations, certainly. But they are physical/body events. Does your Turing machine have a body other than that of some kind of

Re: [agi] Pure reason is a disease.

2007-05-23 Thread Mike Tintner
P.S. Eric, I haven't forgotten your question to me, will try to address it in time - the answer is complex.

Re: [agi] Pure reason is a disease.

2007-05-23 Thread Mike Tintner
Eric, The point is simply that you can only fully simulate emotions with a body as well as a brain. And emotions, while identified by the conscious brain, are felt with the body. I don't find it at all hard to understand - I fully agree - that emotions are generated as a result of

Re: [agi] Pure reason is a disease.

2007-05-23 Thread J Storrs Hall, PhD
On Wednesday 23 May 2007 06:34:29 pm Mike Tintner wrote: My underlying argument, though, is that your (or any) computational model of emotions, if it does not also include a body, will be fundamentally flawed both physically AND computationally. Does everyone here know what an ICE is in

Re: [agi] Pure reason is a disease.

2007-05-20 Thread Jiri Jelinek
Mark, In computer systems, searches are much cleaner so the backup search functionality typically doesn't make sense. ..I entirely disagree... searches are not simple enough that you can count

Re: [agi] Pure reason is a disease.

2007-05-20 Thread Mark Waser
Hi Mark, AGI(s) suggest solutions; people decide what to do. 1. People are stupid and will often decide to do things

Re: [agi] Pure reason is a disease.

2007-05-16 Thread Jiri Jelinek
building something as dangerous as an entity that will eventually be more powerful than us. Mark, relying on the fact that you

Re: [agi] Pure reason is a disease.

2007-05-16 Thread Mark Waser
Mark, In computer systems, searches are much cleaner so the backup search functionality typically doesn't make sense. ..I entirely disagree... searches are not simple enough that you can count

Re: [agi] Pure reason is a disease.

2007-05-03 Thread Mark Waser
Mark, logic, when it relies upon single-chain reasoning, is relatively fragile. And when it rests upon bad assumptions, it can be just a roadmap to disaster

Re: [agi] Pure reason is a disease.

2007-05-02 Thread Mark Waser
for this as well in our perception of and stories about emotionless people. Mark P.S. Great discussion. Thank you.

Re: [agi] Pure reason is a disease.

2007-05-02 Thread Mark Waser
effects -- which is, of course, a very dangerous assumption). Mark, I understand your point but have an emotional/ethical problem

Re: [agi] Pure reason is a disease.

2007-05-02 Thread Eric Baum
My point, in that essay, is that the nature of human emotions is rooted in the human brain architecture. Mark: I'll agree that human emotions are rooted in human brain architecture but there is also the question -- is there something analogous to emotion which is generally

Re: [agi] Pure reason is a disease.

2007-05-02 Thread Mark Waser
/selected through the force of evolution -- and it's hard to argue with long-term evolution. My point, in that essay

Re: [agi] Pure reason is a disease.

2007-05-02 Thread Jiri Jelinek
of and stories about emotionless people. Mark P.S. Great discussion. Thank you. Mark, I understand your point but have

[agi] Pure reason is a disease.

2007-05-01 Thread Mark Waser
From the Boston Globe (http://www.boston.com/news/education/higher/articles/2007/04/29/hearts__minds/?page=full) Antonio Damasio, a neuroscientist at USC, has played a pivotal role in challenging the old assumptions and establishing emotions as an important scientific subject. When Damasio

Re: [agi] Pure reason is a disease.

2007-05-01 Thread Benjamin Goertzel
Well, this tells you something interesting about the human cognitive architecture, but not too much about intelligence in general... I think the dichotomy between feeling and thinking is a consequence of the limited reflective capabilities of the human brain... I wrote about this in The Hidden

Re: [agi] Pure reason is a disease.

2007-05-01 Thread Mark Waser
give reasons why I believe it has happened this way) but which, in an ideal world/optimized entity, would be continuous. Well

Re: [agi] Pure reason is a disease.

2007-05-01 Thread Benjamin Goertzel
On 5/1/07, Mark Waser [EMAIL PROTECTED] wrote: Well, this tells you something interesting about the human cognitive architecture, but not too much about intelligence in general... How do you know that it doesn't tell you much about intelligence in general? That was an incredibly dismissive
