--- On Thu, 9/18/08, Trent Waddington <[EMAIL PROTECTED]> wrote:

> On Fri, Sep 19, 2008 at 7:54 AM, Matt Mahoney
> <[EMAIL PROTECTED]> wrote:

> >  Perhaps there are some applications I haven't
> thought of?
> 
> Bahahaha.. Gee, ya think?

So perhaps you could name some applications of AGI that don't fall into the 
categories of (1) doing work or (2) augmenting your brain?

A third one occurred to me: launching a self-improving or evolving AGI to 
consume all available resources, i.e. an intelligent worm or self-replicating 
nanobots. This really isn't a useful application, but I'm sure somebody, 
somewhere, might think it would be really cool to see if it would launch a 
singularity and/or wipe out all DNA-based life.

Oh, I'm sure the first person to try it would take precautions like inserting a 
self-destruct mechanism that activates after some number of replications. (The 
1988 Morris worm included code intended to slow its spread, but a bug caused it 
to replicate out of control anyway.) 
Or maybe they will be like the scientists who believed that the idea of a chain 
reaction in U-235 was preposterous...
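The generation-limited self-destruct idea above can be sketched in a few lines. This is purely a hypothetical simulation (the Agent class and names are my own invention, and nothing actually replicates); its point is that the safeguard is a single counter check, so a single bug in that check defeats it, which is roughly what happened with the Morris worm:

```python
# Hypothetical sketch: a replication budget passed down through generations.
# Nothing real replicates here; each Agent just spawns child Agent objects
# until its generation counter reaches the limit.

class Agent:
    def __init__(self, generation=0, max_generations=3):
        self.generation = generation
        self.max_generations = max_generations

    def replicate(self):
        # The "self-destruct": refuse to spawn past the generation limit.
        # One off-by-one or failed comparison here removes the only brake.
        if self.generation >= self.max_generations:
            return []
        return [Agent(self.generation + 1, self.max_generations)]

def population_growth(max_generations):
    # Count total agents produced before the limit halts replication.
    frontier = [Agent(0, max_generations)]
    total = 0
    while frontier:
        total += len(frontier)
        frontier = [child for a in frontier for child in a.replicate()]
    return total

print(population_growth(3))  # prints 4: the ancestor plus three generations
```

Of course, in this toy each agent spawns only one child; a real worm spawns many per generation, so the population grows exponentially until the counter (if it works) cuts it off.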
(Thankfully, the scientists who actually built the first atomic pile took some 
precautions, such as standing by with an axe to cut a rope suspending a cadmium 
control rod in case things got out of hand. They got lucky because of an 
unanticipated phenomenon: a small fraction of neutrons are released seconds 
after fission, which made the chain reaction much easier to control.)


-- Matt Mahoney, [EMAIL PROTECTED]



-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/