Bob Mottram wrote:
This made me chuckle:

"So after the first safe AI is built, the situation will stabilize
completely and any further change will always occur in a controlled
way that is consistent with the original design."

Okay, O Wicked-Tongued One, how would you have summarized an extremely complex set of factors and ensuing sequence of events in one tiny little sentence? ;-)

Each one of these one-paragraph summaries is a statement of the conclusion of an enormous essay. Some of those essays I have written but only published on lists; eventually I will get them all out in blog form.

But having said all that, the above quote is an exact statement of something that I have justified on this list in some detail: did you read that previous argument? I can see nothing in that quote that is inaccurate, or that cannot be justified quite thoroughly.




Richard Loosemore


-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/