On 8/30/2012 4:51 PM, Craig Weinberg wrote:


On Thursday, August 30, 2012 7:38:27 PM UTC-4, Brent wrote:

    On 8/30/2012 2:19 PM, Craig Weinberg wrote:


    Computational analogies can only provide us with a toy model of morality.
    Should I eat my children, or should I order a pizza? It depends on the
    anticipation of statistical probabilities, etc...no different than how the
    equilibrium of oxygen and CO2 in my blood determines whether I inhale or
    exhale.

    It also depends on what you want. No decision problem can be solved
    without values. The values that evolved biologically are common and don't
    change very fast; so it's a good bet you love your children more than
    yourself.


It's a good bet to metabolize carbohydrates before protein too, but that doesn't imply that a moral dimension could or would be conjured out of nowhere to somehow assist that decision.



    This kind of modeling may indeed offer some predictive strategies and
    instrumental knowledge of morality, but if we had to build a person or a
    universe based on this description, what would we get? Where is the
    revulsion, disgust, and blame - the stigma and shaming...the deep and
    violent prejudices? Surely they are not found in the banal evils of game
    theory.

    They're found in the banal neurons of your brain,


Not necessarily. All that we find in neurons so far is molecules. No sign of any disgust. We have only our own word for the fact that such a thing as disgust even exists. TV shows aren't in the banal pixels of a TV screen. The internet isn't in my computer.

    so they could be part of the morals of a robot if we chose to build it
    that way. From our perspective as citizens in a very diverse and
    interconnected world of billions of people, we can see ways in which we
    might give a robot better, more adaptive, values than biology has given
    us.

    Brent


If morals didn't exist, why would we choose to invent them? What possible purpose could be served by some additional qualitative layer of experience on top of the perfectly efficient and simple execution of neurochemical scripts? Don't you see that the proposed usefulness of such a thing is only conceivable in hindsight - after the fact of its existence?

We didn't invent them. They evolved. Evolution has no foresight; it's random, and it takes advantage of what is available. Feeling sick at your stomach after eating rotten food is a good adaptation to teach you not to eat stuff like that again. So what feeling would work to guide you not to harm a child? How about that same 'sick at your stomach' feeling.

Brent

--
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.