It is true that eventually this technology will be in the public domain and be available to DARPA.
The important thing is to avoid DARPA getting it before everyone else does. The ***only*** way to do this is to avoid accepting funding from them. If this means that it takes 5 more years to develop, then so be it. If it means that you have to flip burgers by day and code by night, then so be it. If someone makes a deal with the devil, they are only going to receive a bad result. Some people want to delude themselves that they are doing something "good", but their real motives may lie $$$elsewhere$$$. (Not referring to the Novamente team.)

Peace,
Kevin

----- Original Message -----
From: "Ben Goertzel" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Monday, December 02, 2002 9:09 AM
Subject: RE: [agi] An idea for promoting AI development.

> Regarding being wary about military apps of AI technology, it seems to me
> there are two different objectives one might pursue:
>
> 1) to avoid militarization of one's technology
>
> 2) to avoid the military achieving *exclusive* control over one's technology
>
> It seems to me that the first objective is very hard, regardless of whether
> one accepts military funding or not. The only ways that I can think of to
> achieve 1) would be
>
> 1a) total secrecy in one's project all the way
>
> 1b) extremely rapid ascendancy from proto-AGI to superhuman AGI -- i.e.
> reach the end goal before the military notices one's project. This relies
> on "security through simply being ignored" up to the proto-AGI phase...
>
> On the other hand, the second objective seems to me relatively easy. If one
> publishes one's work and involves a wide variety of developers in it, no one
> is going to achieve exclusive power to create AGI. AGI is not like nuclear
> weapons, at least not if a software-on-commodity-hardware approach works (as
> I think it will). Only commodity hardware is required, programming skills
> are common, and math/cog-sci skills are not all *that* rare...
> -- Ben G
>
> > -----Original Message-----
> > From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]On
> > Behalf Of Alexander E. Richter
> > Sent: Monday, December 02, 2002 7:48 AM
> > To: [EMAIL PROTECTED]
> > Subject: RE: [agi] An idea for promoting AI development.
> >
> > At 07:18 02.12.02 -0500, Ben wrote:
> > >....
> > >Can one use military funding for early-stage AGI work and then somehow
> > >demilitarize one's research once it reaches a certain point? One can try,
> > >but will one succeed?
> >
> > They will squeeze you out, like Lillian Reynolds and Michael Brace in
> > BRAINSTORM (1983) (Christopher Walken, Natalie Wood)
> >
> > cu Alex

-------
To unsubscribe, change your address, or temporarily deactivate your subscription,
please go to http://v2.listbox.com/member/?[EMAIL PROTECTED]
