What do you mean by risk? And by deployment?
Everyone has a different understanding of AGI. From my perspective, I'm 
attempting to develop an AGI similar to a human infant. So "deployment" in a 
family setting would perhaps be best: the embodied AGI lives with, is raised 
by, and participates in a human family.
Thoughts?

Date: Tue, 15 Oct 2013 02:04:54 -0700
Subject: [agi] Least risky AGI deployment strategy?
From: [email protected]
To: [email protected]

What are people's opinions on how to develop and deploy AGI in a way that 
encourages or increases the likelihood of outcomes favorable to current forms 
of terrestrial/organic life and its evolutionary trajectory into future viable 
forms (however rapid that may be)?

It seems to me that if AGI grows within the context of work on extending human 
expression, communication, perception, cognitive-behavioral skills, etc., it 
is likely to put us at the forefront of what happens. In this scenario, by the 
time autonomous superhumans emerge, we are on fairly common terms and stand a 
chance to co-evolve beneficially.

However (ignoring for a second the fact that no one has demonstrated anything 
close to it yet) an autonomous AGI "person" in 2013 getting pushed out to 
github would be more of a wildcard, and possibly an existential risk.

What are your thoughts? This topic may be a dead horse to beat, but at the 
same time I doubt there is anyone actively hacking on the mysteries of 
subjectivity, cognition, and life who doesn't think about this sometimes. 
Another reason to keep pressing this is that some of the resistance to 
illuminating the mysteries of natural computation is probably due, more or 
less, to fear of the unknown; so the more confident we can be in strategies 
for success, the more support this sort of work will enjoy.

Rob



  
    
      


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
