Re the military - nice thinking - like your Trojan horse idea. But more 
seriously: no, I don't think I would take money from DARPA (however, if they 
rock up at my doorstep with a no-strings-attached cheque... who knows... my 
principles are most definitely not for sale... unless the price is *really 
really good* ;-)
 
Re unFriendly - I sure hope that 'my' AGI will have much higher ethical 
standards than I do (which is not particularly difficult, my friends would 
say :). It is just a small way in which I can accumulate some good karma. Tho 
yes, perhaps I can't/shouldn't be trusted any more than anyone else on this 
list. Especially not Eliezer, who actually has very malicious intentions but 
has the genius of pretending to do the exact opposite so as to be on the 
inside :) (I'm really just kidding of course!) More seriously: the day AGI 
arrives will be the day some people try to use it for malicious purposes - 
like with pretty much any other software product. Usually the good guys win. 
If it's a very smart AGI, I don't think we need to worry much, but I'm not 
very convinced that the singularity *will* automatically happen. {IMHO the 
nature of intelligence implies it is not amenable to simple linear scaling - 
likely not even log-linear. But that's just my contrary view. :} So we need 
to take care only in the early stages. I don't think you can avoid misuse by 
trying to keep the work secret - OSS has proved the opposite and is our best 
hope, I think (said very humbly, of course) - the good people on this planet 
outnumber the evil ones ten to one or more.
 
=Jean-Paul
 
 
Department of Information Systems
Email: [EMAIL PROTECTED]
Phone: (+27)-(0)21-6504256
Fax: (+27)-(0)21-6502280
Office: Leslie Commerce 4.21


>>> "Mark Waser" <[EMAIL PROTECTED]> 2007/06/04 14:52:09 >>>
>> Except I would not work for a company that would aim to retain (exclusive 
>> or long-term) commercial rights to AGI design (and thus become rulers of 
>> the world :)

Which is a lot of the thought process behind my suggestion.  I don't see a 
Friendly AGI as allowing such a situation to exist and if the AGI 
effectively owns the company . . . .

>> nor would I accept funding from any source that aims to adopt AGI 
>> research outcomes for military purposes.

My sense of humor says --> Oh, I don't know about that . . . .  It might be 
a lot of fun to give the military a Friendly AGI -- since it would probably 
be much more like giving the military to the Friendly AGI.    :-)

But the grim reality is that I wouldn't be sure that I could successfully 
avoid giving them something that I don't want to, sooooo . . . .

>>  I would have to be able to use these (code, idea) *completely* freely as 
>> I would deem fit

But what if you had unFriendly intentions?


-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=231415&user_secret=e9e40a7e