Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-20 Thread Eliezer S. Yudkowsky
Joshua Fox wrote: Turing also committed suicide. And Chislenko. Each of these people had different circumstances, and suicide strikes everywhere, but I wonder if there is a common thread. Ramanujan, like many other great mathematicians and achievers, died young. There are on the other hand

Re: [agi] Logical Satisfiability

2008-01-20 Thread Jim Bromer
I believe that a polynomial solution to the Logical Satisfiability problem will have a major effect on AI, and I would like to discuss that at some time. Jim Bromer Richard Loosemore [EMAIL PROTECTED] wrote: This thread has nothing to do with artificial general intelligence: is there not a

Re: [agi] Logical Satisfiability

2008-01-20 Thread Jim Bromer
I had no idea what you were talking about until I read Matt Mahoney's remarks. I do not understand why people have so much trouble reading my messages, but it is not entirely my fault. I may have misunderstood something that I read, or you may have misinterpreted something that I was saying.

Re: [agi] Logical Satisfiability

2008-01-20 Thread Vladimir Nesov
Jim, I'm sure most people here don't have any difficulty understanding what you are talking about. You seem to lack a solid understanding of these basic issues, however. Please stop this off-topic discussion; I'm sure you can find somewhere else to discuss computational complexity. Read a good

KILLTHREAD -- Re: [agi] Logical Satisfiability

2008-01-20 Thread Ben Goertzel
Hi all, I'd like to kill this thread, because not only is it off-topic, but it seems not to be going anywhere remotely insightful or progressive. Of course a polynomial-time solution to the boolean satisfiability problem could potentially have impact on AGI (though it wouldn't necessarily do so

Re: KILLTHREAD -- Re: [agi] Logical Satisfiability

2008-01-20 Thread Jim Bromer
I am disappointed because the question of how a polynomial time solution of logical satisfiability might affect AGI is very important to me. Jim Bromer Ben Goertzel [EMAIL PROTECTED] wrote: Hi all, I'd like to kill this thread, because not only is it off-topic, but it seems not to be going

Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-20 Thread Daniel Allen
Regarding the suicide rates of geniuses or those with high intelligence, I wouldn't be concerned: Berman says that the intelligence study is less useful than those that point to *risk factors like divorce or unemployment*. "It's not as if I'm going to get more worried about my less

Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-20 Thread Matt Mahoney
--- Mike Dougherty [EMAIL PROTECTED] wrote: On Jan 19, 2008 8:24 PM, Matt Mahoney [EMAIL PROTECTED] wrote: --- Eliezer S. Yudkowsky [EMAIL PROTECTED] wrote: http://www.wired.com/techbiz/people/magazine/16-02/ff_aimystery?currentPage=all Turing also committed suicide. That's a

Re: KILLTHREAD -- Re: [agi] Logical Satisfiability

2008-01-20 Thread Ben Goertzel
On Jan 20, 2008 2:34 PM, Jim Bromer [EMAIL PROTECTED] wrote: I am disappointed because the question of how a polynomial time solution of logical satisfiability might affect AGI is very important to me. Well, feel free to start a new thread on that topic, then ;-) In fact, I will do so: I will

[agi] SAT, SMT and AGI

2008-01-20 Thread Ben Goertzel
I wrote On Jan 20, 2008 2:34 PM, Jim Bromer [EMAIL PROTECTED] wrote: I am disappointed because the question of how a polynomial time solution of logical satisfiability might affect AGI is very important to me. Well, feel free to start a new thread on that topic, then ;-) In fact, I will

Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-20 Thread Tyson
I believe that humans have the emotions that we do because of the environment we evolved in. The more selfish/fearful/emotional you are, the more likely you are to survive and reproduce. For humans, I think logic is a sort of tool used to help us achieve happiness. Happiness is the top-priority

Re: [agi] SAT, SMT and AGI

2008-01-20 Thread Vladimir Nesov
On Jan 21, 2008 1:35 AM, Ben Goertzel [EMAIL PROTECTED] wrote: However, I would rephrase the question as: How would a pragmatically useful polynomial time solution of logical satisfiability affect AGI? In fact, it's interesting to talk about how existing SAT and SMT solvers
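For context on how existing SAT solvers operate in practice, the following is a minimal, illustrative sketch of the DPLL backtracking procedure that underlies modern solvers (simplification, unit propagation, branching). It is a toy, not any solver discussed in the thread; the function name and clause encoding are arbitrary choices for this example.

```python
# Toy DPLL-style SAT solver (illustrative sketch, not production code).
# Clauses are lists of nonzero integers: positive n means variable n,
# negative n means its negation. Returns a satisfying assignment
# (dict var -> bool) or None if the formula is unsatisfiable.
def dpll(clauses, assignment=None):
    if assignment is None:
        assignment = {}
    # Simplify each clause under the current partial assignment.
    simplified = []
    for clause in clauses:
        new_clause = []
        satisfied = False
        for lit in clause:
            var, val = abs(lit), lit > 0
            if var in assignment:
                if assignment[var] == val:
                    satisfied = True  # clause already true; drop it
                    break
            else:
                new_clause.append(lit)
        if satisfied:
            continue
        if not new_clause:
            return None  # empty clause under assignment: conflict
        simplified.append(new_clause)
    if not simplified:
        return assignment  # every clause satisfied
    # Unit propagation: a one-literal clause forces its variable.
    for clause in simplified:
        if len(clause) == 1:
            lit = clause[0]
            return dpll(simplified, {**assignment, abs(lit): lit > 0})
    # Branch: try both truth values for the first unassigned literal.
    lit = simplified[0][0]
    for val in (lit > 0, lit <= 0):
        result = dpll(simplified, {**assignment, abs(lit): val})
        if result is not None:
            return result
    return None
```

For example, `dpll([[1, 2], [-1, 2], [-2, 3]])` finds an assignment, while `dpll([[1], [-1]])` returns None. Real solvers add clause learning, heuristic variable ordering, and restarts on top of this basic loop, which is why they handle large structured instances despite the worst-case exponential behavior.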

Re: [agi] SAT, SMT and AGI

2008-01-20 Thread Pei Wang
On Jan 20, 2008 5:35 PM, Ben Goertzel [EMAIL PROTECTED] wrote: In AGI, we don't care that much about worst-case complexity, nor even necessarily about average-case complexity for very large N. We care mainly about average-case complexity for realistic N and for the specific probability

Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-20 Thread Daniel Allen
Regarding AGI research as potentially psychologically disturbing, there are so many other ways to be psychologically disturbed in a postmodern world that it may not matter :) It's already hard for a lot of people to have a healthy level of self-esteem or self-identity, and nihilism is not in

Re: [agi] SAT, SMT and AGI

2008-01-20 Thread Ben Goertzel
So, people do have a practically useful way of cheating problems in NP now. Problem with AGI is, we don't know how to program it even given computers with infinite computational power. Well, that is wrong IMO. AIXI and the Gödel Machine are provably correct ways to achieve AGI with

Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-20 Thread Samantha Atkins
On Jan 19, 2008, at 5:24 PM, Matt Mahoney wrote: --- Eliezer S. Yudkowsky [EMAIL PROTECTED] wrote: http://www.wired.com/techbiz/people/magazine/16-02/ff_aimystery?currentPage=all Turing also committed suicide. In his case I understand that the British government saw fit to sentence