a wrote:
Are you trying to make an "intelligent" program, or do you want to launch a singularity? I think you are trying to do the former, not the latter. I think you do not have a plan and are "thinking out loud" — chatting in this list is equivalent to thinking out loud. Think it all out first, before chatting. I will not chat in this list anymore. If you want to launch a singularity, then do something practical. Simply do vision/spatial.

I'm working on thoughts for how such a program should be written. I haven't started writing seriously or settled on a design, but I'm trying to create one. I don't expect to succeed, but the "payoff" if I did would be that the AI that got created was one I thought well of. I don't want to attempt to control what it does, but rather what it wants to do (i.e., the goal structure).

My hypothesis is that if the AI wants to do something, it will eventually figure out how to do it; if it wants to avoid doing something, it will figure out how to avoid it. So what you need to do is create a goal system which is powerful, safe, and efficient. Ideally it should be an ESS (evolutionarily stable strategy), but I don't think I could prove that of any feasible real system.
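For what it's worth, the ESS condition itself is easy to state and check on toy games. The sketch below uses the standard Hawk-Dove payoffs (V = 2, C = 4) from the game-theory literature — it says nothing about any real goal system, it just illustrates Maynard Smith's stability test that the paragraph above appeals to:

```python
# Toy ESS (evolutionarily stable strategy) check on the Hawk-Dove game.
# The payoff values V=2, C=4 are standard textbook choices, not from this email.

def payoff(p, q):
    """Expected payoff of a p-hawk mixed strategy against a q-hawk mixed strategy."""
    V, C = 2.0, 4.0
    hh = (V - C) / 2   # hawk meets hawk: share V, pay cost C
    hd = V             # hawk meets dove: hawk takes everything
    dh = 0.0           # dove meets hawk: dove gets nothing
    dd = V / 2         # dove meets dove: split V
    return (p * q * hh + p * (1 - q) * hd
            + (1 - p) * q * dh + (1 - p) * (1 - q) * dd)

def is_ess(s, grid=101, tol=1e-9):
    """Maynard Smith's conditions: s is an ESS if no mutant m does strictly
    better against s, and on ties s does strictly better against m than
    m does against itself."""
    for i in range(grid):
        m = i / (grid - 1)
        if abs(m - s) < 1e-12:
            continue
        a, b = payoff(s, s), payoff(m, s)
        if b > a + tol:
            return False      # mutant invades outright
        if b > a - tol and payoff(m, m) >= payoff(s, m) - tol:
            return False      # mutant drifts in on a tie
    return True

# With V=2, C=4 the mixed strategy p = V/C = 0.5 is the unique ESS,
# while pure Hawk and pure Dove are both invadable:
print(is_ess(0.5), is_ess(0.0), is_ess(1.0))  # True False False
```

The point of the exercise: "stable" here means no rare mutant strategy can gain a foothold — which is exactly the property one would want, but almost certainly couldn't prove, of a real goal system.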

As to a singularity... I think we've already crossed the "Schwarzschild boundary" analog. We couldn't give up technology without 90% of humanity dying, and the world won't support the current population with the current technology, so we've got to keep pushing the technology forwards. Even a zero-population-growth scenario wouldn't make the current state stable. We might be able to stabilize things if each couple could have only one child per lifetime for a few generations... but the system wouldn't be able to maintain itself long enough for that to bring the world down to carrying capacity. So we end up needing BOTH technology AND population control. (TV is an excellent population control device: where TV is introduced, populations tend to stabilize — at least if there is decent programming. But it doesn't suffice.) So we're committed.

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=8660244&id_secret=53477015-68c27c
