I posted this to the Culture earlier but thought people here on Brin-L might like to think about it too:
Suppose we have artificially intelligent and sentient programs. Somebody sets one of these up in a closed virtual environment in which it will be tortured to death in the slowest and most horrific manner imaginable. I think we'd all agree that running that program is morally wrong, because it would inflict suffering on a sentient being. However, computers are deterministic, which means that all the future states are inherent in the combination of the program and the initial state. Does that mean that creating the program and initial state is itself morally wrong, even if nobody ever runs the program? If not, what does running the program do that's so special that it becomes morally wrong to run it but not to create it?

Is it just that running it allows us to view the horrific consequences that are latent in, and fully determined by, the initial state? Does that mean that suffering, and hence morality, are in some manner relative to external observers? What, then, about the situation in which the computer running the program is in a sealed box that is utterly destroyed when the program terminates, so that no external observer can ever see those consequences? Would running it then be okay?

An extension of this thought experiment might make things even more confusing. The issue is essentially that there's a "Platonic" sense in which the whole future of the program exists, because it's encoded in the present. Indeed, it's possible to make a reversible program in which the whole past of the program exists in the same way. But if we're going to take that view, then the initial state of the program exists in the same (or a similar) Platonic way anyway. How, then, does writing it down make any difference? (Or, for that matter, how does running it?)

Along the lines of Searle's Chinese Room, we could imagine someone very clever writing down the initial state with a pen on paper. It seems hard to believe that writing it down like that is morally wrong.
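As an aside, the determinism and reversibility point can be made concrete with a toy sketch (purely illustrative; the state update here is an arbitrary invertible function I've chosen, not anything to do with sentient programs). Because the step function is deterministic, every future state is already fixed by the initial state; because it's also invertible, the entire past is encoded in any later state too:

```python
# A deterministic, reversible "program": its whole future is a pure
# function of the initial state, and its whole past is recoverable
# from any later state.

def step(state):
    a, b = state
    return (b, a + b)      # deterministic: same input always gives same output

def unstep(state):
    a, b = state
    return (b - a, a)      # exact inverse of step

initial = (0, 1)
s = initial
for _ in range(10):        # "running" it merely unfolds what's already encoded
    s = step(s)

# Recover the initial state by running the inverse backwards:
t = s
for _ in range(10):
    t = unstep(t)
assert t == initial
```

In this sense the run adds nothing that wasn't already there: it just makes the latent trajectory manifest, which is exactly the intuition the thought experiment is probing.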
However, if my argument is valid, then not only writing it down, but writing it down and then immediately burning it, would be morally wrong.

Rich
GCU I Don't Know What I Think On This Issue
