SPOILERS

On Mon, 16 Jul 2001, Erik Reuter wrote:

> > 3.  The movie has some technical and logical flaws.
> 
> I'd say that is an understatement. I went in expecting sci-fi, and I was
> very disappointed. If I had expected a fairy-tale, I think I would have
> had a much better time.

I went in expecting a science fictional sort of movie that would deal with
the psychological and philosophical implications (for humans and robots)
of making a robot with genuine emotions.  I felt quite satisfied.  I did
not expect technical perfection on all details--no SF movie ever seems to
have that--but I did expect the Pinocchio angle, since every preview for
the movie I have seen mentioned it (besides, it's sort of implicit in the
setup:  "his love is real, but he is not").
 
> Here are a few of the things that I question. Some of them don't stand
> very well on their own but rather fit in with my impression of the
> general level of technology (for example, Star Trek has amazing sensors
> and tractor-beams and many sci-fi stories gloss over power sources for
> things like robots, but with such amazing technology in AI why did none
> of the humans survive?):

I don't know, and I'll grant that this is one of the movie's great
faults--you really can't extinguish the human race, without giving some
kind of reason why or setting up that eventuality as a likely possibility,
and expect people to blithely accept it as a plot point.

> 1) They have been making robots for 50 years and this is the first time anyone
> tried very hard to make a robot with real emotions?

Well, um, we've been making computers for 50 years and haven't yet made
one with real emotions...one presumes that it's quite technically
difficult.  (And why would you want to?  Slaves with real emotions are a
pain in the ass.)  On top of which, there's the problem of understanding
emotions well enough to program for them in the first place (Ding! Ding!
Philosophical issue alert!). :-)

> 2) There is no good reason for the imprinting being irreversible. An
> adequate hardware and software protected reset command should be easily
> doable.

There is if, as Prof. Hobby suggests in the scene in Manhattan, their
goal is not merely to create a simulacrum of emotions but to create a robot
that's truly self-motivated, with a rich inner life--one that has a "self" as
opposed to merely a sophisticated program.  It's reasonable to suppose
that a sufficiently complicated artificial "brain," like a real one,
wouldn't have a convenient reset switch.

> 3) Anything the robot puts in its mouth mucks up the inner
> workings? Nobody thought of a bag attached to the throat? (also, how can
> he go underwater without wetting his circuit boards?)

Yes, this is kind of a silly contradiction.  But not a fatal one in my
eyes.

> 4) No discussion of safeguards built into the robots against hurting
> people or property damage?

From comparing David to Gigolo Joe, I gather that Joe-class robots
probably can't hurt people, at least not intentionally.  David seems to be
a different order of machine.  Have you read Rudy Rucker's robot novels?
They posit that robots can be very intelligent but hemmed in by Asimovian
laws...until a maverick scientist teaches the bots how to circumvent that
programming, at which point they become fully autonomous.  I think
something similar is happening with David in A.I.

> 5) An ice age came on so quickly? And no humans survived?

Yeah, the lack of explanation here is a serious fault.

> 6) Sea-level rose in New York by 300 feet?

Environmentalist hyperbole.  But it makes for great special effects.

> 7) How could the police helicopter's "magnet" thingie levitate/attract
> Gigolo Joe from a distance of 100 feet without affecting David or any of
> the other metal around? Is this some sort of "tractor-beam", a la Star
> Trek?

It's a staple of movie sci-fi.  Not very rigorous, and spearing Joe with a
rope and harpoon would have been much more dramatic, but this strikes me
as pretty trivial nitpicking.

> 8) David's power lasted for about 1000 years? (maybe he went into some
> low power mode? still, even hundreds of years is hard to imagine)

True.  Maybe the superbots had jumper cables. :-)

> 9) None of the humans attempted to download their personalities and
> knowledge into a robot (seeing as how some robots survived?)

The ability to build a robot doesn't imply the ability to extract intact
human personalities from human brains.

> 10) The super-advanced robots can pull information out of the very fabric
> of space time, and create selective memories (David's "mother" didn't
> recall everything), but they can't create a human or robot with those
> memories that can last for more than a day?

This was the thing that bugged me the most--the pseudo-scientific
explanation in this case was as inelegant as, if not more so than, Qui-Gon
Jinn's "midichlorian" speech.

I'm working on a longer essay about how I feel about this movie, but I
think I'll end up putting it on a web page.  I suspect it'll make an
annoyingly long post.

Marvin Long
Austin, Texas

Nuke the straight capitalist wildebeests for Buddha!
