Travis Lenting wrote:
> I don't like the idea of enhancing human intelligence before the singularity.

The singularity is a point of infinite collective knowledge, and therefore 
infinite unpredictability. Everything has to happen before the singularity 
because there is no after.

> I think crime has to be made impossible even for enhanced humans first. 

That is easy. Eliminate all laws.

> I would like to see the singularity-enabling AI be as little like a 
> reproduction machine as possible.

Is there a difference between enhancing our intelligence by uploading and 
creating killer robots? Think about it.

> Does it really need to be a general AI to cause a singularity? Can it not 
> just stick to scientific data and quantify human uncertainty? It seems like 
> it would be less likely to ever care about killing all humans so it can rule 
> the galaxy, or that it's an omnipotent servant.

Assume we succeed. People want to be happy. Depending on how our minds are 
implemented, it's either a matter of rewiring our neurons or rewriting our 
software. Is that better than a gray goo accident?

 -- Matt Mahoney, [email protected]




________________________________
From: Travis Lenting <[email protected]>
To: agi <[email protected]>
Sent: Sun, June 27, 2010 5:21:24 PM
Subject: Re: [agi] Questions for an AGI

I don't like the idea of enhancing human intelligence before the singularity. I 
think crime has to be made impossible even for enhanced humans first. I think 
life is too apt to abuse opportunities when possible. I would like to see the 
singularity-enabling AI be as little like a reproduction machine as possible. 
Does it really need to be a general AI to cause a singularity? Can it not just 
stick to scientific data and quantify human uncertainty? It seems like it would 
be less likely to ever care about killing all humans so it can rule the galaxy, 
or that it's an omnipotent servant.


On Sun, Jun 27, 2010 at 11:39 AM, The Wizard <[email protected]> wrote:

>Agreed on this dangerous thought! 
>
>
>On Sun, Jun 27, 2010 at 1:13 PM, Matt Mahoney <[email protected]> wrote:
>
>>This is wishful thinking. Wishful thinking is dangerous. How about instead of 
>>hoping that AGI won't destroy the world, you study the problem and come up 
>>with a safe design.
>>
>> -- Matt Mahoney, [email protected]
________________________________
>>From: rob levy <[email protected]>
>>To: agi <[email protected]>
>>Sent: Sat, June 26, 2010 1:14:22 PM
>>Subject: Re: [agi]
>> Questions for an AGI
>>
>>
>>>why should AGIs give a damn about us?
>>>
>>I like to think that they will give a damn because humans have a unique way 
>>of experiencing reality, and there is no reason not to take advantage of that 
>>precious opportunity to create astonishment or bliss. If anything is 
>>important in the universe, it's ensuring positive experiences in all areas in 
>>which it is conscious, and I think it will realize that. And with the 
>>resources available in the solar system alone, I don't think we will be much 
>>of a burden. 
>>
>>
>>I like that idea.  Another reason might be that we won't crack the problem of 
>>autonomous general intelligence, but the singularity will proceed regardless 
>>as a symbiotic relationship between life and AI.  That would be beneficial to 
>>us as a form of intelligence expansion, and beneficial to the artificial 
>>entity as a way of being alive and having an experience of the world.  
>
>
>
>-- 
>Carlos A Mejia
>
>Taking life one singularity at a time.
>www.Transalchemy.com  
>



-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=8660244&id_secret=8660244-6e7fb59c
Powered by Listbox: http://www.listbox.com
