> Extra credit:
> I've just read the Crichton novel PREY. Totally transparent movie script but
> a perfect textbook on how to screw up really badly. Basically the formula
> is 'let the military finance it'. The general public will see this
> inevitable movie and we will be drawn towards the moral battle we are
> creating.
> 
> In early times it was the 'tribe over the hill' we feared. Communication has
> killed that. Now we have the 'tribe from another planet' and the 'tribe from
> the future' to fear and our fears play out just as powerfully as any time in
> our history.

Note: I'm not arguing for or against AI here, just sharing some personal observations.


This particular situation is different from the others you describe (the tribe over the hill). To accept the dangers of AI, one must first swallow our pride as a species and admit that we are not the top dogs in the universe. Few people are willing to do this, even among well-educated, science-minded engineers. I just tested this topic on my group of internet friends in a private forum with twenty-some people. With a day's worth of intensive back-and-forth discussion, I was unable to convince a single person that this danger is real. They assumed the typical "we can just control it" mentality that has always been prevalent. Notice that even in gloomy bad-AI stories such as Terminator and the Matrix, the humans always win in the end. This is what the mainstream will believe, because they want to believe it.

In other words, I don't think the public is going to care one iota about the dangers of AI. They'd prefer to focus their energy on banning truly harmless technologies, such as cloning. People fear clones because, as far as they are concerned, clones are people too, so we're dealing with an equal and can lose. But AIs are just "machines"; they can be "out-smarted" or "out-evolved" as far as the average person is concerned.

The upside is that AI researchers won't have to fight to keep their research legal.

The downside is that we're more likely to destroy ourselves.


-Brad

