I'm sorry, what is AGI again?
Andrew Yost, PhD
Forest Ecologist
Oregon Dept. of Forestry
Salem, OR 97310
503-945-7410
From: Derek Zahn [mailto:[EMAIL PROTECTED]]
Sent: Saturday, September 22, 2007 11:02 AM
To: singularity@v2.listbox.com
Subject: [singularity] Benefits of being a kook
Near the beginning of this discussion, reference is made to the generally poor
mainstream press response to the recent Singularity conference. Does anyone
have link(s) to any general press reviews, journalistic coverage, etc? I
haven't had much luck in locating any and would like to get a
Who cares? Really, who does? You can't create an AGI that is friendly or
unfriendly. It's like having a friendly or unfriendly baby. How do you
prevent the next Hitler, the next Saddam, the next Osama, and so on and so
forth? A friendly society is a good start. Evil doesn't evolve in the
On 9/24/07, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:
Near the beginning of this discussion, reference is made to the generally
poor mainstream press response to the recent Singularity conference. Does
anyone have link(s) to any general press reviews, journalistic coverage,
etc? I haven't
Artificial Stupidity,
You're making a lot of wild assumptions here, and I am not sure what you
are driving at. You can't compare good or bad machines with good or
bad humans. Do you have a serious, genuine interest in AGI? I would
hope so, or why would you be here? It takes a while to
Artificial Stupidity wrote:
Who cares? Really, who does? You can't create an AGI that is friendly
or unfriendly. It's like having a friendly or unfriendly baby. How
do you prevent the next Hitler, the next Saddam, the next Osama, and so
on and so forth? A friendly society is a good
See
http://www.topix.net/content/ap/2007/09/techies-ponder-computers-smarter-than-us-4.
It's from the Associated Press, so it's written once
and then copy-pasted to news sources all over the
world.
- Tom
--- [EMAIL PROTECTED] wrote:
Near the beginning of this discussion, reference is
made to
If I understand the singularity correctly, it will result in humans
having greatly expanded and enhanced information processing capabilities
(Human + Machine). Assuming these new capabilities will ultimately
result in higher levels of consciousness, then, can we say that the
singularity will
--- Artificial Stupidity [EMAIL PROTECTED] wrote:
Who cares? Really, who does? You can't create an
AGI that is friendly or
unfriendly. It's like having a friendly or
unfriendly baby.
No, it is not. A baby comes pre-designed by evolution
and genetics. An AGI can be custom-written to spec.
Tom McCabe [EMAIL PROTECTED] wrote:
See
http://www.topix.net/content/ap/2007/09/techies-ponder-computers-smarter-than-us-4.
It's from the Associated Press, so it's written once
and then copy-pasted to news sources all over the
world.
This discussion reminds me of a couple videos I saw, not