Ya know... all this "unfriendly AI" stuff reminds me of when Drexler handed
me a xerox of his initial thoughts on nanotech at a space development
conference in the early 80s.  It launched right away into "the grey goo"
dangers.  Driggers, O'Leary and O'Neill had just walked back "The High
Frontier" in their paper "New Routes To Manufacturing In Space" because it
was obvious that NASA had come in over budget on LEO cost/lb by a factor of
a thousand or so.  The Space Studies Institute had been working on the
lunar mass driver assuming they could get space settlement bootstrapped
with the earlier estimates but now they had to go to partially
self-replicating lunar systems with large-latency teleoperation to get the
bootstrap mass down by a factor of a thousand.  Drexler, a big Feynman fan,
had taken that idea to the extreme with "nanotech".  But, see, he had a
problem:  No one but the L5 Society scifi fanboys was buying the idea.
"Grey goo" was Drexler's way of getting people to "think past the sale" --
focus the "conversation" on "What in the world are we going to do to
contain the THREAT of nanotech?"  In so doing, any cognitive psychologist
would tell you that he was getting people to reify belief in the viability
of nanotech without ever addressing its reality.  This worked well enough
that I actually had guys in San Diego trying to talk me out of working on
legislation to privatize launch services and focus on working with nanotech
CAD tools to get the space settlement bootstrap mass low enough that it
could all be done in one Shuttle flight.

The "unfriendly AI" stuff is sort of like that, but it's outlived its
usefulness as a means of getting investment.  All it seems destined to do now
is support the "algorithmic bias" activists.  ABAs are profoundly
uninterested in identifying priors that are least-biased, such as the size
prior of Solomonoff, the speed prior of Schmidhuber, etc.  No, what the
ABAs are interested in is pointing and screeching at behavior by AIs that
they consider "bad" and then engaging in political extortion to get kludges
inserted into the data and code to stifle it.  This is likely to have the
same effect on the AIs that it has on people:  Make them stupid.
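For reference, the two priors named above can be stated compactly.  This is a
sketch in my own notation, not the authors' full formal treatment; in
particular the speed-prior line is the rough "runtime-discounted" reading of
Schmidhuber's construction rather than his exact FAST-search definition:

```latex
% Solomonoff's size prior: weight every program p that makes a
% universal prefix machine U print an output beginning with x
% by 2^{-|p|}, where |p| is the program's length in bits.
M(x) \;=\; \sum_{p \,:\, U(p) = x*} 2^{-|p|}

% Schmidhuber's speed prior, roughly: the analogous sum, but each
% program's weight is additionally discounted by the computation
% time t(p) it needs to produce x, so fast programs dominate.
S(x) \;\propto\; \sum_{p \,:\, U(p) = x*} \frac{2^{-|p|}}{t(p)}
```

The point of calling these "least-biased" is that neither encodes any
domain-specific preference: they penalize only description length (and, for
the speed prior, runtime), nothing else.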

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tcc38461f5863ddd2-M5dcb9ffe530ad6c0a14cfa0c
Delivery options: https://agi.topicbox.com/groups/agi/subscription
