Hi Ben,

I think that true machine intelligence will be computationally
demanding and will initially appear on expensive hardware
available only to wealthy institutions such as governments and
corporations. Even when it becomes possible on commodity hardware,
expensive hardware will still support much greater intelligence.

I also think it is not realistic to imagine a small group
creating machine intelligence in total secrecy.

The right approach is to educate the public as widely and loudly
as possible about the nature and dangers of machine intelligence.
In the modern world, broad public education seems to be the best
defense against such dangers.

As you said in another message, "this is going to be a very
difficult issue as time goes on". But I think military applications
present a good opportunity for public education, since people
already accept the idea that biological, chemical and nuclear
weapons should not be used.

Cheers,
Bill

On Mon, 2 Dec 2002, Ben Goertzel wrote:

>
>
> Regarding being wary about military apps of AI technology, it seems to me
> there are two different objectives one might pursue:
>
> 1) to avoid militarization of one's technology
>
> 2) to avoid the military achieving *exclusive* control over one's technology
>
> It seems to me that the first objective is very hard, regardless of whether
> one accepts military funding or not.  The only ways that I can think of to
> achieve 1) would be
>
> 1a) total secrecy in one's project all the way
>
> 1b) extremely rapid ascendancy from proto-AGI to superhuman AGI -- i.e.
> reach the end goal before the military notices one's project.  This relies
> on "security through simply being ignored" up to the proto-AGI phase...
>
> On the other hand, the second objective seems to me relatively easy.  If one
> publishes one's work and involves a wide variety of developers in it, no one
> is going to achieve exclusive power to create AGI.  AGI is not like nuclear
> weapons, at least not if a software-on-commodity-hardware approach works (as
> I think it will).  Only commodity hardware is required, programming skills
> are common, and math/cog-sci skills are not all *that* rare...
>
> -- Ben G
>
>
>
>
>
> > -----Original Message-----
> > From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]On
> > Behalf Of Alexander E. Richter
> > Sent: Monday, December 02, 2002 7:48 AM
> > To: [EMAIL PROTECTED]
> > Subject: RE: [agi] An idea for promoting AI development.
> >
> >
> > At 07:18 02.12.02 -0500, Ben wrote:
> > >....
> > >Can one use military funding for early-stage AGI work and then somehow
> > >demilitarize one's research once it reaches a certain point? One can try,
> > >but will one succeed?
> >
> > They will squeeze you out, like Lillian Reynolds and Michael Brace in
> > BRAINSTORM (1983) (Christopher Walken, Natalie Wood)
> >
> > cu Alex
> >
> > -------
> > To unsubscribe, change your address, or temporarily deactivate
> > your subscription,
> > please go to http://v2.listbox.com/member/?[EMAIL PROTECTED]
> >
>
>
