Krimel said to Ian:
You are right, I must be missing your point. If you are saying that "life"
or "intelligence" can arise "naturally" out of printed circuits, then I
don't think we are even using the same language. When you say intelligence
is not inherent in biological systems, or that genes produce brains but not
intelligence, this just seems to be adding subtlety at the expense of
intelligibility.

Matt:
To interject in a conversation I haven't been following closely at all, I
think Ian's point is that the idea behind the natural/artificial distinction
may be misplaced when talking about the idea of robots someday having
minds/consciousness like humans.  As a pragmatist, I think Ian's stance is
that the mind/consciousness evolved naturally out of biological evolution,
that cultural evolution is predicated on biological, that whatever the mind
is, it is basically what happens when biological processes get really,
really complex.  For pragmatists, traveling up what used to be called the
Great Chain of Being, or up Pirsig's static levels, is at root a continuum
of complexity.

[Krimel]
I think the distinction between natural and artificial is pretty
straightforward. Artificial means specifically man-made; it is the product
of human contrivance. Artificial includes everything from stone tools to
iPods; from goat's milk to Tang; from bear skulls to televangelists.

The artificial is not "unnatural." It is a subset of the natural. Of course
there are grey areas along this continuum, but I would say they run along
the lines of whether hives and nests and beaver dams are artificial. Some
might argue that pigs and goats and wheat are not artificial. But I don't
think the HAL 9000 could be considered any less artificial than the Space
Shuttle.


Matt:
I think the example in point is Asimov's story that got made into
the Will Smith movie, I, Robot.  At that level of robotic complexity, we--as
viewers in addition to the characters--have trouble knowing whether we
should treat them as "one of us," i.e. whether moral/legal categories apply
to them and how.  _This_ is the pertinent question--not how they came to be.
The natural/artificial distinction becomes outmoded.

[Krimel]
Asimov's robots are engineered to obey the four laws. Asimov is talking
about how we would intelligently design another mind. However we elect to
design another mind, the process will be a matter of designing artifacts. 

Medieval Jews had their golem, and Oz never did give nothing to the Tin Man
that he didn't, didn't already own... From the Shelleyesque replicants of
Blade Runner to George Jetson's maid Rosie, the relationship between man and
machine is well represented in the modern mythos.

The issue of how we regard our own artifacts morally and ethically is
especially highlighted in Blade Runner when a replicant begs Harrison Ford
for its life. Ford winds up falling in love with Sean Young. Smith's
relationship to Sonny is a bit more subtle. I agree these are the
interesting issues. The question of how is about as interesting as the
question of when. And I notice the question of 'if' has sort of faded away. 


Matt:
Besides, I think Ian might also be playing at breaking down the distinction
along the lines of, "When did our activities cease to be natural?"  One can
cry foul for common sense, but as a philosophical point, I have some
sympathy because of our Enlightenment philosophical heritage, which treats
"natural" as a moral category of approbation, and hence Will Smith's
difficulty in treating robots morally (ya' know, feeling remorse for
shooting them in the head and such).

[Krimel]
I don't think our activities ever cease to be "natural," but it is still
possible to identify distinctly human artifacts as artificial.



Moq_Discuss mailing list
Listinfo, Unsubscribing etc.
http://lists.moqtalk.org/listinfo.cgi/moq_discuss-moqtalk.org
Archives:
http://lists.moqtalk.org/pipermail/moq_discuss-moqtalk.org/
http://moq.org.uk/pipermail/moq_discuss_archive/