On Wed, 18 Jul 2001, Erik Reuter wrote:

> > I can program my computer to tell me it loves me, but somehow that
> > wouldn't be nearly as satisfying as having my wife tell me the same thing.
> > Let me ask this:  at the beginning of the movie, when Prof. Hobby asks
> > the female robot what love is, and she answers with a series of nothing
> > but physiological responses (rapid breathing, etc.)--is she right?
> 
> Her answer only covered the physiological aspect.

Exactly.  If we ignore the internal aspect of emotions--the part you sort
of have to take on faith that other people have--then all you have left
are the physiological aspects.  If we accurately program something to
mimic those physiological aspects--which includes what it says and
does--can we automatically conclude that we've made something that
actually feels and cares about that fact?  For safety's sake we might want
to assume yes, but I don't think such an assumption is necessarily correct
at all.
 
> Cute. 90% of "science fiction" IS bad. One quality of
> good science fiction is that it posits technologies, or more broadly,
> a technological environment, that is plausible. Sometimes it works,
> often they fail to create a plausible technological environment. Some
> of my favorite SF does not merely extrapolate current technology, but
> to do that believably is difficult! Most fail. It is easier, a higher
> probability play, to extrapolate current technology.

I bow to the superior whoopassedness of Sturgeon's Law.  :-)  I'd quibble
over what I think is a matter of taste, though, and assert that most SF is
bad not because of how it handles tech, but because of its failure to
handle people well.  Call me a softie if you will, but I'm willing to
grant some license with technology if the characters are handled well.
I'm more interested in how the nuts and bolts are going to affect people
than I am in the nuts and bolts themselves.
 
> > More to the point:  a David-style artificial mind is extrapolated from
> > two things, not one--modern computers, which can be rebooted, and the
> > human brain, which cannot.  If the hypothetical imaginary technology falls
> > a little more on one side than the other, in a matter that is essentially
> > moot because either alternative destroys the personality in question, I
> > can't see why that should make much difference in terms of plausibility.
> 
> So you wouldn't mind if you had to throw away your computer rather than
> sell or give it away when you decided you didn't want it anymore?

That's beside the point (especially to the computer, if either way
its personality is erased).  I just think that it's plausible to suppose
that in the development of advanced AI, we might resort to a technology
that resembles the brain more than it does a Pentium III.  Maybe
once David is "imprinted" his mind is designed to start rewiring itself,
so to speak, in order to adapt to its new home, in much the way a brain
physically reorganizes itself as it grows and learns.  Given sufficient
complexity, I think it perfectly plausible that such a machine would have
no reset function, or that adding a reset function might simply be less
cost-effective than chucking the old "brain" and popping in a fresh one.

> 
> Ever heard of time capsules? Or of parents who wish their children to
> follow in their footsteps? How about auto-biographies? Memorials?

Sure, and I suppose some people might use robots as recording devices, but
on the whole I'd expect robots to be a very inefficient way of doing these
things compared to other alternatives.  If I know I'm going to die and I
have a limited amount of time and, presumably, money, I'm more likely to
want to record my thoughts than my mannerisms.  That's just me, though.
 
> A SF theme is not enough to make good SF. Implementation is critical.
> The setting, the technological environment, must be consistent and
> plausible. That is a large part of what makes it science fiction and not
> just fiction.

I agree, but obviously I don't think A.I. fares as badly in this
department as you do.  Uh...sorry 'bout that!  :-)

Marvin Long
Austin, Texas

Nuke the straight capitalist wildebeests for Buddha!
