Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-18 Thread Jiri Jelinek
Matt, Printing "ahh" or "ouch" is just for show. The important observation is that the program changes its behavior in response to a reinforcement signal in the same way that animals do. Let me remind you that the problem we were originally discussing was about qualia and uploading. Not just about a
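The mechanic Jiri and Matt are arguing over is simple to state: a program whose behavior shifts under a reinforcement signal, with the "ahh"/"ouch" strings printed purely for show. The original autobliss.cpp is not reproduced in these previews, so what follows is only a hypothetical minimal C++ sketch of that mechanic (the action count, reward scheme, and update rule are all assumptions, not Matt's code): a two-action agent nudges its action values toward whichever choice the environment rewards.

// reinforce_sketch.cpp -- hypothetical sketch; NOT the actual autobliss.cpp
#include <cstdlib>
#include <ctime>
#include <iostream>

int main() {
    std::srand(static_cast<unsigned>(std::time(0)));
    double value[2] = {0.0, 0.0};   // learned value of action 0 and action 1
    const double rate = 0.1;        // learning rate
    for (int trial = 0; trial < 20; ++trial) {
        // pick the currently better-valued action; break ties at random
        int action = (value[0] == value[1]) ? std::rand() % 2
                                            : (value[1] > value[0] ? 1 : 0);
        // the environment rewards action 1 and punishes action 0
        double reward = (action == 1) ? 1.0 : -1.0;
        value[action] += rate * (reward - value[action]);  // move toward reward
        // the printed string is "just for show": swap the two strings and
        // the learning update above is completely unchanged
        std::cout << (reward > 0 ? "ahh" : "ouch") << std::endl;
    }
    return 0;
}

After a few punished trials the agent settles on the rewarded action, which is the behavioral change Jiri concedes; the disagreement in the thread is over whether anything of this kind, at any scale, amounts to awareness of pain and pleasure.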

RE: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-18 Thread Gary Miller
-----Original Message----- From: Matt Mahoney [mailto:[EMAIL PROTECTED]] Sent: Sunday, November 18, 2007 5:32 PM To: agi@v2.listbox.com Subject: Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!) --- Jiri Jelinek [EMAIL PROTECTED] wrote: Matt, autobliss passes tests

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-18 Thread Matt Mahoney
--- Jiri Jelinek [EMAIL PROTECTED] wrote: Matt, Printing "ahh" or "ouch" is just for show. The important observation is that the program changes its behavior in response to a reinforcement signal in the same way that animals do. Let me remind you that the problem we were originally

RE: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-18 Thread Matt Mahoney
--- Gary Miller [EMAIL PROTECTED] wrote: To complicate things further, a small percentage of humans perceive pain as pleasure and prefer it, at least in a sexual context, or else fetishes like sadomasochism would not exist. And they do in fact experience pain as a greater pleasure. More

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-17 Thread Dennis Gorelik
Matt, Your algorithm is too complex. What's the point of doing step 1? Step 2 is sufficient. Saturday, November 3, 2007, 8:01:45 PM, you wrote: So we can dispense with the complex steps of making a detailed copy of your brain and then have it transition into a degenerate state, and just skip

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-17 Thread Matt Mahoney
--- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: --- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: --- Jiri Jelinek [EMAIL PROTECTED] wrote: On Nov 11, 2007 5:39 PM, Matt Mahoney [EMAIL PROTECTED] wrote: We just need to control the AGI's goal system.

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-17 Thread Jiri Jelinek
Matt, autobliss passes tests for awareness of its inputs and responds as if it has qualia. How is it fundamentally different from human awareness of pain and pleasure, or is it just a matter of degree? If your code has the feelings it reports, then reversing the order of the feeling strings (without

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-14 Thread Richard Loosemore
Matt Mahoney wrote: --- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: --- Jiri Jelinek [EMAIL PROTECTED] wrote: On Nov 11, 2007 5:39 PM, Matt Mahoney [EMAIL PROTECTED] wrote: We just need to control the AGI's goal system. You can only control the goal system of the first

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-13 Thread Matt Mahoney
--- Jiri Jelinek [EMAIL PROTECTED] wrote: On Nov 11, 2007 5:39 PM, Matt Mahoney [EMAIL PROTECTED] wrote: We just need to control the AGI's goal system. You can only control the goal system of the first iteration. ...and you can add rules for its creations (e.g. stick with the same

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-13 Thread Richard Loosemore
Matt Mahoney wrote: --- Jiri Jelinek [EMAIL PROTECTED] wrote: On Nov 11, 2007 5:39 PM, Matt Mahoney [EMAIL PROTECTED] wrote: We just need to control the AGI's goal system. You can only control the goal system of the first iteration. ...and you can add rules for its creations (e.g. stick with

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-13 Thread Matt Mahoney
--- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: --- Jiri Jelinek [EMAIL PROTECTED] wrote: On Nov 11, 2007 5:39 PM, Matt Mahoney [EMAIL PROTECTED] wrote: We just need to control the AGI's goal system. You can only control the goal system of the first iteration. ...and

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-12 Thread Jiri Jelinek
On Nov 11, 2007 5:39 PM, Matt Mahoney [EMAIL PROTECTED] wrote: We just need to control the AGI's goal system. You can only control the goal system of the first iteration. ...and you can add rules for its creations (e.g. stick with the same goals/rules unless authorized otherwise). But if

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-06 Thread Bob Mottram
I've often heard people say things like "qualia are an illusion" or "consciousness is just an illusion", but the concept of an illusion when applied to the mind is not very helpful, since all our thoughts and perceptions could be considered as illusions reconstructed from limited sensory data and

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-05 Thread Jiri Jelinek
Matt, We can compute behavior, but nothing indicates we can compute feelings. Qualia research is needed to figure out new platforms for uploading. Regards, Jiri Jelinek On Nov 4, 2007 1:15 PM, Matt Mahoney [EMAIL PROTECTED] wrote: --- Jiri Jelinek [EMAIL PROTECTED] wrote: Matt, Create a

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-04 Thread Matt Mahoney
--- Jiri Jelinek [EMAIL PROTECTED] wrote: Matt, Create a numeric pleasure variable in your mind, initialize it with a positive number and then keep doubling it for some time. Done? How do you feel? Not a big difference? Oh, keep doubling! ;-)) The point of autobliss.cpp is to illustrate
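Jiri's challenge quoted above is pure arithmetic, and Matt's reply (truncated here) turns to what autobliss.cpp is meant to illustrate. As a hypothetical sketch (not taken from autobliss.cpp; the variable name and the overflow check are assumptions), the doubling experiment made literal looks like this: the "pleasure" value grows until the floating-point representation saturates, and nothing else happens.

// doubling_sketch.cpp -- hypothetical sketch of the quoted thought experiment;
// not part of autobliss.cpp.
#include <iostream>
#include <limits>

int main() {
    double pleasure = 1.0;   // "initialize it with a positive number"
    int doublings = 0;
    while (pleasure != std::numeric_limits<double>::infinity()) {
        pleasure *= 2.0;     // "keep doubling it for some time"
        ++doublings;
    }
    // an IEEE double overflows to +inf after roughly 1024 doublings;
    // the number changed, and that is all that changed
    std::cout << "saturated to +inf after " << doublings << " doublings" << std::endl;
    return 0;
}

Whether that arithmetic says anything about felt pleasure is exactly what Jiri and Matt dispute in the rest of the thread.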

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-04 Thread Russell Wallace
On 11/4/07, Matt Mahoney [EMAIL PROTECTED] wrote: Let's say your goal is to stimulate your nucleus accumbens. (Everyone has this goal; they just don't know it). The problem is that you would forgo food, water, and sleep until you died (we assume, from animal experiments). We have no need to

Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-03 Thread Matt Mahoney
--- Edward W. Porter [EMAIL PROTECTED] wrote: If bliss without intelligence is the goal of the machines you imagine running the world, for the cost of supporting one human they could probably keep at least 100 mice in equal bliss, so if they were driven to maximize bliss, why wouldn't they kill