On 8/20/07, Samantha Atkins <[EMAIL PROTECTED]> wrote:
>> [...] the deep logical (or: moral) flaws in the
>> Singularitarian position. [Please tell us what they are.]
>
> A position that says we should be in a great hurry to get to a state
> of affairs that we cannot remotely understand or control and where we
> will be nearly totally at the mercy of an incomprehensible and
> utterly alien intelligence at least deserves serious questioning now
> and again.
The Singularitarian position does not say we should be in a great
hurry, *unless* there are extremely threatening external existential
risks that cannot be avoided in any better way than by hurrying
towards the Singularity, a hurry which is in itself an existential
risk. See e.g. http://www.nickbostrom.com/astronomical/waste.html

I would very much like a situation in which we could take so much
time and care in getting to the Singularity that I would die long
before then. And I do think we might indeed have a bit more time than
many think. (I'm not counting on it, though.)

> Most of the AGI groups that I believe have most traction
> are not that easy to donate to. I don't believe at this point that
> the Singularity Institute is likely to produce a working AGI. Many
> things it does do are interesting and I would consider donating to it
> for those reasons. But I think FAI is a vast distraction from much
> needed AGI.

Have you noticed the stuff SIAI is about to fund now, with Ben
Goertzel as Director of Research? It's not just FAI.

--
Aleksei Riikonen - http://www.iki.fi/aleksei
