Don,

I think we agree on the basic issues.

The difference is one of emphasis.  Because I believe AGI can be so very
powerful -- starting in perhaps only five years if the right people got
serious funding -- I place much more emphasis on trying to stay well ahead
of the curve with regard to avoiding the very real dangers its great
power could bring.

Edward W. Porter
Porter & Associates
24 String Bridge S12
Exeter, NH 03833
(617) 494-1722
Fax (617) 494-1822
[EMAIL PROTECTED]



-----Original Message-----
From: Don Detrich [mailto:[EMAIL PROTECTED]
Sent: Sunday, September 30, 2007 1:12 PM
To: [email protected]
Subject: RE: [agi] Religion-free technical content



First, let me say I think this is an interesting and healthy discussion
and has enough "technical" ramifications to qualify for inclusion on this
list.



Second, let me clarify that I am not proposing that the dangers of AGI be
"swept under the rug" or that we should be "misleading" the public.



>>I just think we're a long way from having real
data to base such discussions on, which means if held at the moment
they'll inevitably be based on wild flights of fancy.<<



We have no idea what the "personality" of AGI will be like. I believe it
will be VERY different from humans. This goes back to my post "Will AGI
like Led Zeppelin?" To which my answer is, probably not. Will AGI want to
knock me over the head to take my sandwich or steal my woman? No, because
it won't have the same kind of biological imperatives that humans have.
AGI is a whole different animal. We have to wait and see what kind of
animal it will be.



>>By that point, there will be years of time to consider its wisdom and
hopefully apply some sort of friendliness theory to an actually dangerous
stage. <<



Now, you can feel morally at ease to promote AGI to the public and go out
and get some money for your research.



As an aside, let me make a few comments about my point of view. I was half
owner of an IT staffing and solutions company for ten years. I was the
sales manager and a big part of my job was to act as the translator
between the technology guys and the client decision makers, who usually
were NOT technology people. They were business people with a problem
looking for ROI. I have been told by technology people before that
concentrating on "what the hell we actually want to accomplish here" is
not an important technical issue. I believe it is. "What the hell we
actually want to accomplish here" is to develop AGI. Offering a REALISTIC
evaluation of the possible advantages and disadvantages of the technology
is very much a technical issue. What we are currently discussing is: what
ARE the realistic dangers of AGI, and how does that affect our development
and investment strategy? That is both a technical and a strategic issue.





Don Detrich





-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=8660244&id_secret=48325024-2cff63
