On Tue, Jul 17, 2001 at 10:46:00AM -0500, Marvin Long, Jr. wrote:

> SPOILER WARNING
>
> On Tue, 17 Jul 2001, Erik Reuter wrote:
>
> Obviously they've made robots that can act like they love you--the new
> challenge, both technical and moral, is in creating robots that really
> do.

They are one and the same. It's not such a big challenge, given what
else they can do.

> David's trickier, since the point is not just to create a robot boy
> that can act like he loves his mommy, but to create a robot boy that
> really does, and which has all the vulnerabilities such dependence
> implies. Joe can't be hurt if a customer changes her mind about
> having sex with him. David *can* be hurt if Monica refuses to love
> him (unless we grant this premise, there's no story).

You are assuming there is something almost magical about emotions and
love. I believe that if we can ever do the intelligence part, the
emotions will be a snap. Intelligence is hard; emotions are not.

> The difference between these two states of being is a titanic chasm,
> in my opinion, and to blithely suppose that achieving David's state is
> just a matter of the machinery itself strikes me as absurd. It's like
> the difference between software and soul.

But this is not science. As I said before, it is a fairy tale, not
science fiction. In case it is not clear, I totally disagree with your
supposition that there is something special about emotions. I'm sure it
would be easier to create an emotion simulation program than an
intelligence simulation program.

> The truth is that we simply have no reason to assume this.

Of course we do. That is the technology we have now, and I am
extrapolating. Any other assumption relies on some totally new
technology, which, while possible, must be less likely than simply
improving on current technology.

> To use the obvious example, an Asimovian positronic brain is nothing
> like a modern computer.

That was fiction.
For plausibility, it is better to extrapolate from current SCIENCE, not
from FICTION.

> If the writer wants to posit a whole new kind of hardware, he can,
> IMO.

Sure. But if it is implausible enough, I don't consider it science
fiction.

> Spielberg could, of course, have posited that the boy can be reset,
> and then written a plot in which the boy struggles not to be reset
> because it's equivalent to death--imagine if somebody reset your
> personality so that you had no memory of life since day one--but he
> didn't. Being reset or being destroyed and replaced with another
> robot makes no difference in this context except one: being destroyed
> is more viscerally horrifying than being reset, but the difference to
> the personality being erased isn't significant.

That certainly would have made it a better science fiction story.

> Knowledge, yes: just write it down. As for personality, why should
> we believe this? What about the creation of robots implies that they
> should also be able to "download" a personality from a living brain?

It does not have to be downloading (meaning some sort of
hardware<->brain connection). There are many other possibilities. One
simple example would be to have the robot follow you around and learn
to mimic you. This could be combined with direct programming of
important traits and knowledge.

As I said in my original post, the technology was just all out of
whack. If they could make such sophisticated robots, with the
intelligence that they had, robots so life-like and long-lived, then
they should also have been able to create a pretty good simulacrum of a
given person.

Since it probably wouldn't feel to the person being simulated that they
had any continuity with the robot, it would not be any form of
immortality. But I think there would be plenty of people who would try
it nevertheless.

I have different standards for science fiction and for fairy tales
(fantasy).
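The "follow you around and mimic" idea is roughly what machine-learning
people call imitation learning. Here is a toy sketch of how observation
plus direct programming of traits could combine; every name and the
frequency-counting model are my own illustrative inventions, not
anything from the movie or from real robotics:

```python
from collections import Counter, defaultdict

class MimicModel:
    """Toy 'follow you around' learner: records which action a person
    takes in each observed situation, then imitates the most common one."""

    def __init__(self):
        # situation -> Counter of actions seen in that situation
        self.observations = defaultdict(Counter)

    def observe(self, situation, action):
        # Watching the person: tally each (situation, action) pair.
        self.observations[situation][action] += 1

    def program_trait(self, situation, action, weight=1000):
        # "Direct programming": install a trait by giving it a count
        # large enough to dominate anything learned by observation.
        self.observations[situation][action] += weight

    def act(self, situation):
        # Imitate: choose the most frequently observed action, if any.
        if situation not in self.observations:
            return None
        return self.observations[situation].most_common(1)[0][0]

robot = MimicModel()
for _ in range(3):
    robot.observe("greeted", "smile")
robot.observe("greeted", "nod")          # a minority behavior
robot.program_trait("insulted", "stay calm")

print(robot.act("greeted"))   # imitates the majority behavior: smile
print(robot.act("insulted"))  # programmed trait: stay calm
```

Obviously a real simulacrum would need vastly richer models, but the
point stands: nothing about copying surface behavior requires a
hardware-to-brain connection.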
I wouldn't be complaining if this movie were not billed as science
fiction.

--
"Erik Reuter" <[EMAIL PROTECTED]>   http://www.erikreuter.com/
