I read both papers (Aaronson's and his student's) and I'll be eigendamned
if I believe they really said much about morality.
On the other hand, it was an interesting extension of the Prisoner's
Dilemma; too bad they didn't throw Nick et al.'s MOTH (My Way Or The
Highway) strategy in...
I was *hoping* that they were going to implement *networks* of
prisoner-bots who were not necessarily iterating with the same player
every time, but rather iterating with a larger social network (of
networks) with the possibility of seeing enclaves of cooperation emerge,
maybe see how the boundaries of such enclaves survive (I can imagine
something like a Tit-for-Tat variant forming a lipid-like membrane
around much sweeter, up to even AlwaysCooperate, groups)...
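The network-of-prisoner-bots idea could be sketched roughly like this. A toy, not anyone's actual implementation: the payoff matrix and the TFT/AlwaysCooperate/AlwaysDefect strategies are the standard ones, but the graph layout, the per-edge histories, and the scoring are my own assumptions.

```python
# Toy sketch: iterated Prisoner's Dilemma played over a social network
# rather than a single repeated pairing. Each round, every adjacent pair
# plays one PD move against each other, remembering only that neighbor's
# history -- so enclaves of cooperation can (in principle) emerge locally.

# Standard PD payoffs: (row player's score, column player's score).
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(my_hist, their_hist):
    # Cooperate first, then copy the neighbor's previous move.
    return their_hist[-1] if their_hist else 'C'

def always_cooperate(my_hist, their_hist):
    return 'C'

def always_defect(my_hist, their_hist):
    return 'D'

def play_network(adjacency, strategies, rounds=50):
    """Play IPD on every edge of the graph; return total score per node.

    adjacency:  dict node -> list of neighbor nodes (undirected graph).
    strategies: dict node -> strategy function(my_hist, their_hist).
    """
    scores = {v: 0 for v in adjacency}
    # hist[(u, v)] = list of moves u has played against v, and vice versa.
    hist = {}
    edges = {tuple(sorted((u, v))) for u in adjacency for v in adjacency[u]}
    for _ in range(rounds):
        for u, v in edges:
            hu = hist.setdefault((u, v), [])
            hv = hist.setdefault((v, u), [])
            # Both moves are chosen before either history is updated.
            mu = strategies[u](hu, hv)
            mv = strategies[v](hv, hu)
            hu.append(mu)
            hv.append(mv)
            pu, pv = PAYOFF[(mu, mv)]
            scores[u] += pu
            scores[v] += pv
    return scores
```

On a simple line graph (defector - TFT - cooperator - cooperator) the TFT node absorbs the defection at the boundary while the interior AlwaysCooperate pair scores full mutual-cooperation payoffs every round, which is roughly the "membrane" picture.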
But I still don't think it speaks quite to morality?
Anyone else read this?
For all of ye that are skeptical of technology's ability to guide our
morality, read the link below.
Soon, we will all worship at the Church of Google, singing the praises
of EigenJesus!
http://www.scottaaronson.com/blog/?p=1820
============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com
============================================================