Oops, I thought we were having fun, but it looks like I have offended
somebody, again. I plead guilty to straying somewhat from the purely
technical discussion topic, but I thought "Edward W. Porter" and I were
having a pretty interesting discussion. However, it seems my primary
transgression is focusing on AGI as a business opportunity rather than as a
coming apocalypse.

>>AGI is a real danger, and any campaigns to promote the development of AGI
while specifically ignoring discussion of the potential implications are
dangerously irresponsible, helping foster an atmosphere that could lead to
humanity's demise.<<

So, let's look at this from a technical point of view. AGI has the potential
to become a very powerful technology, and if misused or out of control it
could possibly be dangerous. However, at this point we have little idea of
how these potential dangers might become manifest. AGI may or may not want
to take over the world or harm humanity. We may or may not find some
effective way of limiting its power to do harm. AGI may or may not even
work. At this point there is no AGI. Give me one concrete technical example
where AGI is currently a threat to humanity or anything else.

I do not see how, at this time, promoting investment in AGI research is
"dangerously irresponsible" or "fosters an atmosphere that could lead to
humanity's demise". It is up to the researchers to devise a safe way of
implementing this technology, not the public or the investors. The public
and the investors DO want to know that researchers are aware of these
potential dangers and are working on ways to mitigate them, but it serves
nobody's interest to dwell on dangers we as yet know little about and
therefore can't control. Besides, it's a stupid way to promote the AGI
industry or to attract investment for responsible research.

Let me tell you what IS a danger to humanity: global warming, nuclear
weapons, the demise of oil, greed, corruption, poverty, medieval religions,
and paranoia about technology. Now that really IS scary. I fear the current
direction of humanity a hell of a lot more than I fear any future AGI. If we
are lucky, and I mean REALLY lucky, AGI and other technologies will arrive
just in time to save our ass.

Don Detrich