Matt Grimaldi wrote:

<snip>
> "G. D. Akin" wrote:
> >
> > The only instance I can think of off the top of my
> > head is Giskard's development of the "Zeroth Law"
> (snip)
> >
> > Are there other instances I've forgotten?
> >
> > George A
> >
>
>
> What about the story where they installed
> a new control robot in an orbital power station,
> who then decided that humans were not the
> builders of robots (rather, they were
> co-creations of some greater being, to which
> the satellite was sending power), and that
> the technicians sent to install the robot were
> under no circumstances to be allowed near the
> controls of the station.  The technicians
> decided in the end that the situation was
> OK because they weren't being threatened
> by the robot directly (it had to protect the
> creator's minions as per the 3 laws), and the
> robot kept the power beam on target during
> a bad solar storm, and had no signs of
> malfunction other than its misguided
> conclusions.
>
> I also remember a (very) short story where
> Dr. Calvin had to kill one of
> the early robots because it had found
> a way around its 3-laws programming.

You got me there.  It has been a long, long time since I read Asimov's early
Robot stuff--I just don't remember.

I just finished James Gunn's "Isaac Asimov: The Foundations of Science
Fiction" (revised edition), and even there there isn't much about robots
going bad.  Most of it focuses on the "mystery" element of the robot
stories and how it gets solved, especially in the three very good novels.
Mr. Gunn's book is well worth the read.

George A


