Re: [singularity] Is Friendly AI Bunk?

2006-09-11 Thread Ben Goertzel
Hi, Just for kicks - let's assume that AIXItl yields 1% more intelligent results when provided 10^6 times the computational resources when compared to another algorithm X. Let's further assume that today the cost associated with X for reaching a benefit of 1 will be 1 compared to a cost of
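The trade-off sketched in this message can be made concrete with a toy calculation. The figures (1% extra benefit, 10^6 times the compute) are the hypothetical ones from the message itself, not measured values:

```python
# Toy cost-effectiveness comparison using the hypothetical numbers above:
# AIXItl yields 1% more benefit than algorithm X but needs 10^6x the compute.

def benefit_per_unit_cost(benefit, compute_cost):
    """Crude cost-effectiveness: benefit delivered per unit of compute."""
    return benefit / compute_cost

x = benefit_per_unit_cost(benefit=1.00, compute_cost=1)
aixitl = benefit_per_unit_cost(benefit=1.01, compute_cost=10**6)

# Under these assumptions X is roughly a million times more cost-effective.
print(aixitl / x)  # ~1.01e-06
```

This is just the arithmetic behind the thread's conclusion: the marginal benefit of AIXItl is swamped by its resource cost, so a rational agent with finite compute would pick X.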

Re: [singularity] Is Friendly AI Bunk?

2006-09-11 Thread Ben Goertzel
Thanks Ben, Russell et al for being so patient with me ;-) To summarize: AIXItl's inefficiencies are so large and the additional benefit it provides is so small that it will likely never be a logical choice over other more efficient, less optimal algorithms. Stefan The additional benefit it

[singularity] Excerpt from a work in progress by Eliezer Yudkowsky

2006-09-15 Thread Ben Goertzel
Subject: Please fwd to Singularity list To: Ben Goertzel [EMAIL PROTECTED] Ben, please forward this to your Singularity list. ** Excerpts from a work in progress follow. ** Imagine that I'm visiting a distant city, and a local friend volunteers to drive me to the airport. I don't know

Re: [singularity] Convincing non-techie skeptics that the Singularity isn't total bunk

2006-09-25 Thread Ben Goertzel
Peter Voss wrote: I have a more fundamental question though: Why in particular would we want to convince people that the Singularity is coming? I see many disadvantages to widely promoting these ideas prematurely. If one's plan is to launch a Singularity quickly, before anyone else notices,

Re: [singularity] i'm new

2006-10-09 Thread Ben Goertzel
Hi, On 10/9/06, Bruce LaDuke [EMAIL PROTECTED] wrote: Just a sidebar on the whole 2012 topic. It's quite possible that singularity is **already here** as new knowledge and that the only barrier is social acceptance. Radical new knowledge is historically created long before it is accepted by

Re: [singularity] Defining the Singularity

2006-10-22 Thread Ben Goertzel
Japan, despite a lot of interest back in 5th Generation computer days, seems to have a difficult time innovating in advanced software. I am not sure why. I talked recently, at an academic conference, with the guy who directs robotics research labs within ATR, the primary Japanese government research lab. He said that at the

Re: [singularity] Defining the Singularity

2006-10-22 Thread Ben Goertzel
Hi, I know you must be frustrated with fund raising, but investor reluctance is understandable from the perspective that for decades now there has always been someone who said we're N years from full-blown AI, and then N years passed with nothing but narrow AI progress. Of course, someone will end

Re: [singularity] Defining the Singularity

2006-10-23 Thread Ben Goertzel
Though I have remained often-publicly opposed to emergence and 'fuzzy' design since first realising what the true consequences (of the heavily enhanced-GA-based system I was working on at the time) were, as far as I know I haven't made that particular mistake again. Whereas, my view is that it is

Re: Re: [singularity] Defining the Singularity

2006-10-24 Thread Ben Goertzel
Loosemore wrote: The motivational system of some types of AI (the types you would classify as tainted by complexity) can be made so reliable that the likelihood of them becoming unfriendly would be similar to the likelihood of the molecules of an Ideal Gas suddenly deciding to split into

Re: Re: Re: Re: [singularity] Kurzweil vs. de Garis - You Vote!

2006-10-24 Thread Ben Goertzel
Right - for the record when I use words like loony in this sort of context I'm not commenting on how someone might come across face to face (never having met him), nor on what a psychiatrist's report would read (not being a psychiatrist) - I'm using the word in exactly the same way that I would

Re: Re: [singularity] Defining the Singularity

2006-10-26 Thread Ben Goertzel
Hi, About hybrid/integrative architectures, Michael Wilson said: I'd agree that it looks good when you first start attacking the problem. Classic ANNs have some demonstrated competencies, classic symbolic AI has some different demonstrated competencies, as do humans and existing non-AI software.

[singularity] Re: [agi] Motivational Systems that are stable

2006-10-27 Thread Ben Goertzel
... -- Ben G On 10/25/06, Richard Loosemore [EMAIL PROTECTED] wrote: Ben Goertzel wrote: Loosemore wrote: The motivational system of some types of AI (the types you would classify as tainted by complexity) can be made so reliable that the likelihood of them becoming unfriendly would

Re: Re: [singularity] Convincing non-techie skeptics that the Singularity isn't total bunk

2006-10-28 Thread Ben Goertzel
Hi, Do most in the field believe that only a war can advance technology to the point of singularity-level events? Any opinions would be helpful. My view is that for technologies involving large investment in manufacturing infrastructure, the US military is one very likely source of funds.

Re: Re: [singularity] Re: [agi] Motivational Systems that are stable

2006-10-28 Thread Ben Goertzel
Hi, The problem, Ben, is that your response amounts to I don't see why that would work, but without any details. The problem, Richard, is that you did not give any details as to why you think your proposal will work (in the sense of delivering a system whose Friendliness can be very

Re: [agi] Re: [singularity] Motivational Systems that are stable

2006-10-29 Thread Ben Goertzel
Hi, There is something about the gist of your response that seemed strange to me, but I think I have put my finger on it: I am proposing a general *class* of architectures for an AI-with-motivational-system. I am not saying that this is a specific instance (with all the details nailed down)

[singularity] Fwd: After Life by Simon Funk

2006-10-29 Thread Ben Goertzel
FYI -- Forwarded message -- From: Eliezer S. Yudkowsky [EMAIL PROTECTED] Date: Oct 30, 2006 12:14 AM Subject: After Life by Simon Funk To: [EMAIL PROTECTED] http://interstice.com/~simon/AfterLife/index.html An online novella, with hardcopy purchaseable from Lulu. Theme:

Re: Re: [agi] Re: [singularity] Motivational Systems that are stable

2006-10-30 Thread Ben Goertzel
Hi Richard, Let me go back to start of this dialogue... Ben Goertzel wrote: Loosemore wrote: The motivational system of some types of AI (the types you would classify as tainted by complexity) can be made so reliable that the likelihood of them becoming unfriendly would be similar

[singularity] Goertzel meets Sirius

2006-10-31 Thread Ben Goertzel
Me, interviewed by R.U. Sirius, on AGI, the Singularity, philosophy of mind/emotion/immortality and so forth: http://mondoglobo.net/neofiles/?p=78 Audio only... -- Ben - This list is sponsored by AGIRI: http://www.agiri.org/email To unsubscribe or change your options, please go to:

[singularity] Ten years to the Singularity ??

2006-12-11 Thread Ben Goertzel
Hi, For anyone who is curious about the talk Ten Years to the Singularity (if we Really Really Try) that I gave at Transvision 2006 last summer, I have finally gotten around to putting the text of the speech online: http://www.goertzel.org/papers/tenyears.htm The video presentation has been

Re: Re: [singularity] Ten years to the Singularity ??

2006-12-11 Thread Ben Goertzel
comes across in the talk. Yours, Joshua 2006/12/11, Ben Goertzel [EMAIL PROTECTED]: Hi, For anyone who is curious about the talk Ten Years to the Singularity (if we Really Really Try) that I gave at Transvision 2006 last summer, I have finally gotten around to putting the text

Re: Re: Re: [singularity] Ten years to the Singularity ??

2006-12-11 Thread Ben Goertzel
not documented or easily digestible, but it seems like one of the most efficient ways to attack the software development problem. Bo On Mon, 11 Dec 2006, Ben Goertzel wrote: ) Hi Joshua, ) ) Thanks for the comments ) ) Indeed, the creation of a thinking machine is not a typical VC type

Re: Re: [singularity] Ten years to the Singularity ??

2006-12-11 Thread Ben Goertzel
or divided by the population size? -Chuck On 12/11/06, Ben Goertzel [EMAIL PROTECTED] wrote: Hi, For anyone who is curious about the talk Ten Years to the Singularity (if we Really Really Try) that I gave at Transvision 2006 last summer, I have finally gotten around to putting the text

Re: Re: Re: [singularity] Ten years to the Singularity ??

2006-12-12 Thread Ben Goertzel
Hi, You mention intermediate steps to AI, but the question is whether these are narrow-AI applications (the bane of AGI projects) or some sort of (incomplete) AGI. According to the approach I have charted out (the only one I understand), the true path to AGI does not really involve commercially

Re: Re: [singularity] Ten years to the Singularity ??

2006-12-12 Thread Ben Goertzel
BTW Ben, for the love of God, can you please tell me when your AGI book is coming out? It's been in my Amazon shopping cart for 6 months now! The publisher finally mailed me a copy of the book last week! Ben

Re: Re: [singularity] Ten years to the Singularity ??

2006-12-15 Thread Ben Goertzel
Well, the requirements to **design** an AGI on the high level are much steeper than the requirements to contribute (as part of a team) to the **implementation** (and working out of design details) of AGI. I dare say that anyone with a good knowledge of C++, Linux, and undergraduate computer

Re: Re: Re: [singularity] Ten years to the Singularity ??

2006-12-20 Thread Ben Goertzel
Yes, this is one of the things we are working towards with Novamente. Unfortunately, meeting this low barrier based on a genuine AGI architecture is a lot more work than doing so in a more bogus way based on an architecture without growth potential... ben On 12/20/06, Joshua Fox [EMAIL

[singularity] Storytelling, empathy and AI

2006-12-20 Thread Ben Goertzel
This post is a brief comment on PJ Manney's interesting essay, http://www.pj-manney.com/empathy.html Her point (among others) is that, in humans, storytelling is closely tied with empathy, and is a way of building empathic feelings and relationships. Mirror neurons and other related mechanisms

Re: [singularity] Vinge Goertzel = Uplift Academy's Good Ancestor Principle Workshop 2007

2007-02-19 Thread Ben Goertzel
Joshua Fox wrote: Any comments on this: http://news.com.com/2100-11395_3-6160372.html Google has been mentioned in the context of AGI, simply because they have money, parallel processing power, excellent people, an orientation towards technological innovation, and important narrow AI

Re: [singularity] Scenarios for a simulated universe

2007-03-04 Thread Ben Goertzel
Richard, I long ago proposed a working definition of intelligence as Achieving complex goals in complex environments. I then went through a bunch of trouble to precisely define all the component terms of that definition; you can consult the Appendix to my 2006 book The Hidden Pattern
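Goertzel's working definition ("achieving complex goals in complex environments") is given a precise formal treatment in the appendix of The Hidden Pattern; as a loose illustration only (the complexity-weighting scheme below is invented for this sketch, not taken from the book), one might score an agent as a complexity-weighted average of its goal achievement:

```python
# Illustrative sketch only: intelligence as a complexity-weighted average of
# goal achievement across (goal, environment) pairs. All weights and
# achievement values below are made up for demonstration.

def intelligence_score(records):
    """records: list of (goal_complexity, env_complexity, achievement in [0,1]).
    More complex goals in more complex environments count for more."""
    weighted = sum(g * e * a for g, e, a in records)
    total_weight = sum(g * e for g, e, _ in records)
    return weighted / total_weight if total_weight else 0.0

records = [
    (2.0, 3.0, 0.9),  # moderately complex goal, mostly achieved
    (5.0, 4.0, 0.5),  # hard goal in a complex environment, half achieved
    (1.0, 1.0, 1.0),  # trivial goal, fully achieved
]
print(intelligence_score(records))  # ≈ 0.607
```

The point of the weighting is that trivial successes contribute little: an agent that only achieves simple goals in simple environments scores low regardless of how many such goals it completes.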

Re: [singularity] Apology to the list, and a more serious commentary on AIXI

2007-03-09 Thread Ben Goertzel
Alas, that was not quite the question at issue... In the proof of AIXI's ability to solve the IQ test, is AIXI *allowed* to go so far as to simulate most of the functionality of a human brain in order to acquire its ability? I am not asking you to make a judgment call on whether or not it

Re: [singularity] Apology to the list, and a more serious commentary on AIXI

2007-03-09 Thread Ben Goertzel
AIXI is valueless. Well, I agree that AIXI provides zero useful practical guidance to those of us working on practical AGI systems. However, as I clarified in a prior longer post, saying that mathematics is valueless is always a risky proposition. Statements of this nature have been

Re: [singularity] The Extropian Creed by Ben

2008-01-20 Thread Ben Goertzel
your options, please go to: http://v2.listbox.com/member/?; -- Ben Goertzel, PhD CEO, Novamente LLC and Biomind LLC Director of Research, SIAI [EMAIL PROTECTED] We are on the edge of change comparable to the rise of human life on Earth. -- Vernor Vinge - This list is sponsored by AGIRI

Re: [singularity] The Extropian Creed by Ben

2008-01-20 Thread Ben Goertzel
/?; -- Ben Goertzel, PhD CEO, Novamente LLC and Biomind LLC Director of Research, SIAI [EMAIL PROTECTED] We are on the edge of change comparable to the rise of human life on Earth. -- Vernor Vinge - This list is sponsored by AGIRI: http://www.agiri.org/email To unsubscribe or change your

Re: [singularity] The Extropian Creed by Ben

2008-01-20 Thread Ben Goertzel
On Jan 20, 2008 1:54 PM, Ben Goertzel [EMAIL PROTECTED] wrote: Hi Natasha After discussions with you and others in 2005, I created a revised version of the essay, which may not address all your complaints, but hopefully addressed some of them. http://www.goertzel.org/Chapter12_aug16_05

Re: [singularity] The Extropian Creed by Ben

2008-01-21 Thread Ben Goertzel
, please go to: http://v2.listbox.com/member/?; -- Ben Goertzel, PhD CEO, Novamente LLC and Biomind LLC Director of Research, SIAI [EMAIL PROTECTED] We are on the edge of change comparable to the rise of human life on Earth. -- Vernor Vinge - This list is sponsored by AGIRI: http

[singularity] Multi-Multi-....-Multiverse

2008-01-25 Thread Ben Goertzel
that was really refreshing!!!) ben -- Ben Goertzel, PhD CEO, Novamente LLC and Biomind LLC Director of Research, SIAI [EMAIL PROTECTED] If men cease to believe that they will one day become gods then they will surely become worms. -- Henry Miller

Re: [singularity] Wrong focus?

2008-01-26 Thread Ben Goertzel
Mike, I certainly would like to see discussion of how species generally may be artificially altered, (including how brains and therefore intelligence may be altered) - and I'm disappointed, more particularly, that Natasha and any other transhumanists haven't put forward some half-way

Re: [singularity] Wrong focus?

2008-01-27 Thread Ben Goertzel
Craig Venter co-creating a new genome - Just to be clear: They did not create a new genome, rather they are re-creating a subset of a previously existing one... is an example of the genetic keyboard playing on itself, i.e. one genome [Craig Venter] has played with another genome and will

Re: [singularity] Multi-Multi-....-Multiverse

2008-01-27 Thread Ben Goertzel
On Jan 27, 2008 5:26 PM, Vladimir Nesov [EMAIL PROTECTED] wrote: On Jan 27, 2008 9:29 PM, John K Clark [EMAIL PROTECTED] wrote: Ben Goertzel [EMAIL PROTECTED] we can think about a multi-multiverse, i.e. a collection of multiverses, with a certain probability distribution over them

Re: [singularity] Multi-Multi-....-Multiverse

2008-01-27 Thread Ben Goertzel
Nesov wrote: Exactly. It needs stressing that probability is a tool for decision-making and it has no semantics when no decision enters the picture. ... What's it good for if it can't be used (= advance knowledge)? For other purposes we'd be better off with specially designed random

Re: [singularity] Multi-Multi-....-Multiverse

2008-01-28 Thread Ben Goertzel
) notation. -- Vladimir Nesovmailto:[EMAIL PROTECTED] - This list is sponsored by AGIRI: http://www.agiri.org/email To unsubscribe or change your options, please go to: http://v2.listbox.com/member/?; -- Ben Goertzel, PhD CEO, Novamente LLC and Biomind LLC

Re: [singularity] Multi-Multi-....-Multiverse

2008-01-29 Thread Ben Goertzel
OK, but why can't they all be dumped in a single 'normal' multiverse? If traveling between them is accommodated by 'decisions', there is a finite number of them for any given time, so it shouldn't pose structural problems. The whacko, speculative SF hypothesis is that lateral movement btw

Re: [singularity] Multi-Multi-....-Multiverse

2008-02-02 Thread Ben Goertzel
current model of the universe is in many ways wrong ... it seems interesting to me to speculate about what a broader, richer, deeper model might look like -- Ben Goertzel (list owner, plus the guy who started this thread ;-) On Feb 2, 2008 3:54 AM, Samantha Atkins [EMAIL PROTECTED] wrote: WTF does

[singularity] Quantum resonance btw DNA strands?

2008-02-05 Thread Ben Goertzel
This article http://www.physorg.com/news120735315.html made me think of Johnjoe McFadden's theory that quantum nonlocality plays a role in protein-folding http://www.surrey.ac.uk/qe/quantumevolution.htm H... ben -- Ben Goertzel, PhD CEO, Novamente LLC and Biomind LLC Director

Re: Re : Re : Re : [singularity] Quantum resonance btw DNA strands?

2008-02-05 Thread Ben Goertzel
... thanks Ben Goertzel List Owner On Feb 5, 2008 4:36 PM, Bruno Frandemiche [EMAIL PROTECTED] wrote: hello, to me (stop me if you have the truth, I am very open) http://www.spaceandmotion.com/wave-structure-matter-theorists.htm cordially yours, bruno - Original message From: Bruno

Re: Re : Re : Re : Re : [singularity] Quantum resonance btw DNA strands?

2008-02-05 Thread Ben Goertzel
Hi Bruno, effectively, my commentary is very short, so excuse me (I drive my PC with my eyes because I have ALS with tracheo and gastro, and I was a speaker, not a writer, and it's difficult) Well that is certainly a good reason for your commentaries being short! hello ben ok, i stop, no

[singularity] Brief report on AGI-08

2008-03-08 Thread Ben Goertzel
-- Ben Goertzel, PhD CEO, Novamente LLC and Biomind LLC Director of Research, SIAI [EMAIL PROTECTED] If men cease to believe that they will one day become gods then they will surely become worms. -- Henry Miller --- singularity Archives: http

[singularity] Microsoft Launches Singularity

2008-03-24 Thread Ben Goertzel
http://www.codeplex.com/singularity

Re: [singularity] Vista/AGI

2008-04-06 Thread Ben Goertzel
/member/?; Powered by Listbox: http://www.listbox.com -- Ben Goertzel, PhD CEO, Novamente LLC and Biomind LLC Director of Research, SIAI [EMAIL PROTECTED] If men cease to believe that they will one day become gods then they will surely become worms. -- Henry Miller

Re: [singularity] Vista/AGI

2008-04-06 Thread Ben Goertzel
If the concept behind Novamente is truly compelling enough, it should be no problem to make a successful pitch. Eric B. Ramsay Gee ... you mean, I could pitch the idea of funding Novamente to people with money?? I never thought of that!! Thanks for the advice ;-pp Evidently, the concept

Re: [singularity] Vista/AGI

2008-04-06 Thread Ben Goertzel
On Sun, Apr 6, 2008 at 12:21 PM, Eric B. Ramsay [EMAIL PROTECTED] wrote: Ben: I may be mistaken, but it seems to me that AGI today in 2008 is in the air again after 50 years. Yes You are not trying to present a completely novel and unheard of idea and with today's crowd of sophisticated

Re: [singularity] Vista/AGI

2008-04-06 Thread Ben Goertzel
On Sun, Apr 6, 2008 at 4:42 PM, Derek Zahn [EMAIL PROTECTED] wrote: I would think an investor would want a believable specific answer to the following question: When and how will I get my money back? It can be uncertain (risk is part of the game), but you can't just wave your hands

Re: [singularity] Vista/AGI

2008-04-06 Thread Ben Goertzel
/member/?; Powered by Listbox: http://www.listbox.com -- Ben Goertzel, PhD CEO, Novamente LLC and Biomind LLC Director of Research, SIAI [EMAIL PROTECTED] If men cease to believe that they will one day become gods then they will surely become worms. -- Henry Miller

Re: [singularity] Vista/AGI

2008-04-08 Thread Ben Goertzel
/archive/rss/11983/ Modify Your Subscription: http://www.listbox.com/member/?; Powered by Listbox: http://www.listbox.com -- Ben Goertzel, PhD CEO, Novamente LLC and Biomind LLC Director of Research, SIAI [EMAIL PROTECTED] If men cease to believe that they will one day become gods

Re: Promoting AGI (RE: [singularity] Vista/AGI)

2008-04-08 Thread Ben Goertzel
/member/archive/11983/=now RSS Feed: http://www.listbox.com/member/archive/rss/11983/ Modify Your Subscription: http://www.listbox.com/member/?; Powered by Listbox: http://www.listbox.com singularity | Archives | Modify Your Subscription -- Ben

Re: Promoting AGI (RE: [singularity] Vista/AGI)

2008-04-08 Thread Ben Goertzel
Of course what I imagine emerging from the Internet bears little resemblance to Novamente. It is simply too big to invest in directly, but it will present many opportunities. But the emergence of superhuman AGI's like a Novamente may eventually become, will both dramatically alter the

Re: [singularity] Vista/AGI

2008-04-13 Thread Ben Goertzel
Samantha, You know, I am getting pretty tired of hearing this poor mouth crap. This is not that huge a sum to raise or get financed. Hell, there are some very futuristic rich geeks who could finance this single-handed and would not really care that much whether they could somehow monetize

Re: [singularity] Vista/AGI

2008-04-13 Thread Ben Goertzel
I don't think any reasonable person in AI or AGI will claim any of these have been solved. They may want to claim their method has promise, but not that it has actually solved any of them. Yes -- it is true, we have not created a human-level AGI yet. No serious researcher disagrees. So why

Re: [singularity] Vista/AGI

2008-04-13 Thread Ben Goertzel
Hi, Just my personal opinion...but it appears that the exponential technology growth chart, which is used in many of the briefings, does not include AI/AGI. It is processing centric. When you include AI/AGI the exponential technology curve flattens out in the coming years (5-7) and becomes

Re: [singularity] Vista/AGI

2008-04-14 Thread Ben Goertzel
Brain-scan accuracy is a very crude proxy for understanding of brain function; yet a much better proxy than anything existing for the case of AGI... On Sun, Apr 13, 2008 at 11:37 PM, Richard Loosemore [EMAIL PROTECTED] wrote: Ben Goertzel wrote: Hi, Just my personal opinion