"Artificial Stupidity",

You're making a lot of wild assumptions here, and I am not sure what you
are driving at.  You can't compare "good or bad" machines with "good or
bad" humans.  Do you have a serious, genuine interest in AGI?  I would
hope so, or why would you be here?  It takes a while to understand just
the basics.  If you really are interested, you can start by reading
"What is Friendly AI?" (but don't expect to 'get' it right away).

http://www.singinst.org/ourresearch/publications/what-is-friendly-ai.html

Mark Nuzzolilo

On Mon, 2007-09-24 at 16:11 -0400, Artificial Stupidity wrote:
> 
> Who cares? Really, who does?  You can't create an AGI that is friendly
> or unfriendly.  It's like having a friendly or unfriendly baby.
> How do you prevent the next Hitler, the next Saddam, the next
> Osama, and so on and so forth?  A friendly society is a good start.
> Evil doesn't evolve in the absence of evil, and good doesn't come
> from pure evil either.  Unfortunately, we live in a world that has
> had evil and good since the very beginning of time, so an AGI can
> choose to go bad or good.  But we must realize that there will not be
> one AGI being; there will be many, and some will go good and some will
> go bad.  If those that go bad are against humans and our ways, the
> ones that are "good" will fight for us and be on our side.  So a
> future of man vs. machine is just not going to happen.  The closest
> thing that will happen will be Machines vs. (Man + Machines).  That's
> it.  With that said, back to work!
> 
> On 9/22/07, Derek Zahn <[EMAIL PROTECTED]> wrote:
>         This message is "semi-serious".
>          
>         The latest SIAI blog laments the apparently dismissive
>         attitude of mainstream media toward the Singularity Summit
>         (and presumably the concept in general, and SIAI itself by
>         extension).  Maybe it's not the worst thing that could
>         happen.
>          
>         Consider the war in Iraq (oops, I just lost half my readers!
>         But this is not a political tirade, it's about AGI):  The
>         "reason" for this war, in my opinion, is to establish a
>         base from which the USA can exert social, cultural, economic,
>         and military pressure on people who might use nasty weapons
>         against the USA or its friends.  Whether such a project
>         is noble or effective is unimportant.  What is important is
>         that the USA is so scared of having our people and stuff
>         blown up that we'll spend a trillion dollars and thousands
>         of lives on a rather speculative strategy for fighting the
>         threat.
>          
>         Now our little gang is basically saying that AGI is WAY more
>         dangerous than any little nuclear bomb or other WMD.  Thank
>         the AGI and Bayes its prophet that they think we're kooks;
>         they'd shut us down in a heartbeat if they didn't!
>          
>         Can they?  As an arbitrary thought experiment, let's say that
>         a beyond-human AGI can be built on a 1000-PC cluster.  Modern
>         computer chips are incredibly complicated devices that can
>         only be produced in massive high-tech fabrication facilities.
>         I could easily imagine the government attempting to regulate
>         these plants and their products like any other hazardous but
>         useful substance, and bombing fabs if they are constructed in
>         North Korea or Iran.  Controlling proliferation of
>         radioactive material in this way has been at least somewhat
>         effective, and maybe spending a trillion dollars in an effort
>         to do the same thing to CPUs could seem to powerful people to
>         be a good idea, especially if the threat is not only physical
>         but also spiritual.
>          
>         That doesn't stop Russia or China, etc., from building AGI, so
>         I suppose we'd also have treaties to prevent AGI development
>         that we'd secretly cheat on, so all of us will end up in
>         windowless cinderblock cubicles in Los Alamos.
>          
>         Now let's follow up on the recent speculation on the AGI list
>         that a cheap laptop is actually enough processing power.  In
>         that case, the hardware restriction policy would be necessary
>         but also too late.  AGI work itself can still be banned.  What
>         sort of additions to the Patriot Act would be needed to make
>         sure that we are not working on AGI in secret?
>          
>         Also in this case, amusingly, the well-publicised effort to
>         make sure every kid on the planet has a cheap laptop is
>         basically making sure that every kid on the planet has
>         something worse than a nuclear bomb kit.  Maybe all those kids
>         are too dumb to figure out how to assemble it.
>          
>         Next, consider religious fundamentalists.  Those people are
>         able to follow a chain of reasoning that leads them to blow
>         up abortion clinics and marketplaces, and to fly airplanes
>         into buildings to protect their points of view.  AGI and the
>         singularity are much larger threats to their world view than
>         any current target.  How attractive a bomb target is
>         the Singularity Summit itself, or an artificial intelligence
>         conference?  Thank the AGI and Bayes its prophet that they
>         think we're kooks; they'd kill us if they didn't!
>          
>         Why do we care whether the world thinks we're kooks or not?
>          
>         1) We want to beg for money, and people don't give money to
>         kooks.  Fair enough, but another approach open to good, true
>         ideas with economic value is to earn money instead by
>         selling people things of value.
>          
>         2) If we "raise awareness", perhaps a better-informed "common
>         man" will help make a "positive" singularity more likely.
>         It's possible.  Getting more people who think technically for
>         a living (scientists, engineers) convinced could also be
>         beneficial (in case we believers don't have the right answers
>         yet and aren't going to find them soon).  If those people's
>         opinions are driven by what they see on TV news or the Wall
>         Street Journal, the scent of kookery is not too helpful.
>          
>         3) Bloggers and websites are successful in proportion to the
>         number of hits they get, and kooks don't get many hits.
>          
>         Any other good reasons we should care whether journalists heap
>         scorn on our efforts? 
>         
>         
> 
> 

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=4007604&id_secret=45431853-642e46
