On 8/30/2012 2:19 PM, Craig Weinberg wrote:
On Thursday, August 30, 2012 4:47:19 PM UTC-4, Alberto G.Corona wrote:
There is a human nature, and therefore a social nature with invariants.
In computational terms, the human mind is a collection of hardwired
programs, codified by a developmental program, itself codified by a genetic
program, which incidentally is 90% identical in all humans (an amazing
homogeneity for a single species).
These hardwired programs create behaviours in humans that interact in a
social environment. By game theory, you can verify that there are Nash
equilibria among these human players. These optima of well-being for all,
within the constraints of human nature, called Nash equilibria, are the
moral code.
In general they are not Nash equilibria. Evolution doesn't settle on Nash
equilibria because in many cases they are unstable for finitely repeated
games; cf. Gintis, "The Bounds of Reason".
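As an illustrative aside (not from the thread itself): the pure-strategy Nash equilibria being discussed can be found by brute force in a small game. The sketch below uses a hypothetical "stag hunt" payoff matrix, a standard toy model of social coordination; the payoff numbers are chosen only for illustration.

```python
# Brute-force search for pure-strategy Nash equilibria in a 2x2 game.
# payoffs[row][col] = (row player's payoff, column player's payoff)
STAG, HARE = 0, 1
payoffs = [
    [(4, 4), (0, 3)],  # row player hunts stag
    [(3, 0), (3, 3)],  # row player hunts hare
]

def pure_nash_equilibria(payoffs):
    """Return all (row, col) strategy pairs where neither player can
    gain by unilaterally deviating to another strategy."""
    n_rows, n_cols = len(payoffs), len(payoffs[0])
    equilibria = []
    for r in range(n_rows):
        for c in range(n_cols):
            row_pay, col_pay = payoffs[r][c]
            # Row player cannot improve by switching rows, column fixed:
            row_ok = all(payoffs[alt][c][0] <= row_pay for alt in range(n_rows))
            # Column player cannot improve by switching columns, row fixed:
            col_ok = all(payoffs[r][alt][1] <= col_pay for alt in range(n_cols))
            if row_ok and col_ok:
                equilibria.append((r, c))
    return equilibria

print(pure_nash_equilibria(payoffs))  # prints [(0, 0), (1, 1)]
```

Note that this toy game has two equilibria (both hunt stag, both hunt hare), which is exactly the point at issue above: game theory by itself doesn't say which equilibrium a population settles on, or whether it is stable under repetition.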
These equilibria are not sharp maxima but vary slightly according to the
social coordinates. They are lines of surface maxima. These maxima are
known by our intuition because we have undergone social selection, so
knowledge of them is intuitive. That we have undergone social selection
means that the groups of hominids, or the individual hominids, whose
conduct strayed from the Nash equilibria disappeared. Being near these
equilibria was an advantage, so we have these hardwired intuitions, which
the Greeks called Nous and the Christians call soul.
That doesn't seem like something individual that will survive dissolution of
the body.
What happens is that a broad variety of moral behaviours are really the
expression of the same moral code operating in different circumstances
where the optimum has been displaced. There are very interesting studies,
for example in the foundational book of evolutionary psychology, "The
Adapted Mind":
http://en.wikipedia.org/wiki/The_Adapted_Mind
about the circumstances in which a mother may abandon her newborn child in
extreme cases (in the study about pregnancy sickness). This would be at one
extreme of the social spectrum; on the contrary, in an affluent society
close to ours, the rules are quite "normal". Both the normal behaviour and
the extreme behaviour are created by the same basic algorithm of
individual/social optimization. Whether we see this in a dynamic way
(contemplating the variations and extremes) or a static one (contemplating
a "normal" society), morality is a unique, universal rule system.
Thanks to research on evolution applied to humans, computer science, and
game theory, this is a rediscovered fact about human nature and its
society, one that awaits the development of an evolutionary morality.
I don't think biological evolution has been nearly fast enough to give us hardwired ethics
suited to modern industrial nation states. That's why diverse cultures have evolved:
different ways of trying to satisfy the moral instincts that evolved for life in a small
tribe. In theory the interaction of these cultures would eventually pick a winner
(cultural selection), but in practice technology and other factors (e.g. global warming,
oil depletion, war) may change things on a much shorter time scale.
Computational analogies can only provide us with a toy model of morality. Should I eat
my children, or should I order a pizza? It depends on the anticipation of statistical
probabilities, etc., no different from how the equilibrium of oxygen and CO2 in my blood
determines whether I inhale or exhale.
It also depends on what you want. No decision problem can be solved without values. The
values that evolved biologically are common and don't change very fast, so it's a good bet
you love your children more than yourself.
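As a side illustration (not from the thread): the point that a decision problem is only solvable once values are supplied can be made concrete with a toy expected-utility calculation. The actions, outcomes, probabilities, and utility numbers below are all hypothetical; the mock "eat children vs. order pizza" framing echoes the example above.

```python
# Each action maps to a list of (probability, outcome) pairs.
actions = {
    "order_pizza": [(1.0, "fed, children safe")],
    "eat_children": [(1.0, "fed, children gone")],
}

# Values are exogenous inputs to the decision procedure: change these
# utilities and the "rational" choice changes with them.
utility = {
    "fed, children safe": 10.0,
    "fed, children gone": -1000.0,
}

def best_action(actions, utility):
    """Pick the action with the highest expected utility."""
    def expected(outcomes):
        return sum(p * utility[o] for p, o in outcomes)
    return max(actions, key=lambda a: expected(actions[a]))

print(best_action(actions, utility))  # prints order_pizza
```

The probabilities alone (both actions reach "fed" with certainty) cannot decide anything; only the utility function, which encodes the evolved values in question, breaks the tie.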
This kind of modeling may indeed offer some predictive strategies and instrumental
knowledge of morality, but if we had to build a person or a universe based on this
description, what would we get? Where is the revulsion, disgust, and blame - the stigma
and shaming...the deep and violent prejudices? Surely they are not found in the banal
evils of game theory.
They're found in the banal neurons of your brain, so they could be part of the morals
of a robot if we chose to build it that way. From our perspective as citizens in a very
diverse and interconnected world of billions of people, we can see ways in which we might
give a robot better, more adaptive, values than biology has given us.
Brent
--
You received this message because you are subscribed to the Google Groups
"Everything List" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to
[email protected].
For more options, visit this group at
http://groups.google.com/group/everything-list?hl=en.