On Aug 19, 2007, at 12:26 PM, Matt Mahoney wrote:

I was never really a Singularity activist, but

1. I realized the singularity is coming and nothing can stop it.

Not so. Humanity could harm its technological base badly enough to postpone the Singularity on this planet for quite some time. We could still bomb ourselves back into the Stone Age. We could do a Nehemiah Scudder thing in the US and slow ourselves down for at least another century, and perhaps toss around some nukes to boot. The race toward stupidity may overtake our best efforts. The push to control and monitor everything may get a huge shot in the arm from the next real or contrived terrorist attack, and we may lose the freedom necessary for the work as a result. I haven't even touched on natural disasters.

2. The more I study the friendly AI problem, the more I realize it is
intractable.

Largely agreed.

3. Studying the singularity raises issues (e.g. does consciousness exist?)
that conflict with hardcoded beliefs that are essential for survival.

Huh?  Are you conscious?

4. The vast majority of people do not understand the issues anyway.

So?  Isn't that the way it always is with great advances?

See my answers below.



--- Joshua Fox <[EMAIL PROTECTED]> wrote:

This is the wrong place to ask this question, but I can't think of anywhere
better:

There are people who used to be active in blogging, writing to the email lists, donating money, public speaking, or holding organizational positions in Singularitarian and related fields -- and are no longer anywhere near as
active. I'd very much like to know why.

Possible answers might include:

1. I still believe in the truthfulness and moral value of the
Singularitarian position, but...
a. ... eventually we all grow up and need to focus on career rather than
activism.

I never considered it something that required a strong appeal to the public at large. I also think that expecting the Singularity to solve all our problems, to the point of focusing only on it, is a very illogical tack for all but the few researchers working on it. It is the latest pie-in-the-sky promise that it will all be utter perfection by and by. There is something that feels more than a bit juvenile in the attitude of many of us.

b. ... I just plain ran out of energy and interest.
c. ... public outreach is of no value or even dangerous; what counts is the
research work of a few small teams.

Mainly I agree with this.

d. ... why write on this when I'll just be repeating what's been said so
often.

Too much time is wasted on repetition of the same old questions and ideas. I am on way too many email lists and have too many interests for my own good.

e. ... my donations are meaningless compared to what a dot-com millionaire
can give.
2. I came to realize the deep logical (or: moral) flaws in the
Singularitarian position. [Please tell us what they are.]

A position that says we should be in a great hurry to reach a state of affairs that we cannot remotely understand or control, and in which we will be nearly totally at the mercy of an incomprehensible and utterly alien intelligence, at least deserves serious questioning now and again.

3. I came to understand that Singularitarianism has some logical and moral validity, but no more than many other important causes to which I give my
time and money.


I am 53 years old and have too little net worth. I have much to do to get my own house in order. I give to a few causes like life extension. Most of the AGI groups that I believe have the most traction are not that easy to donate to. I don't believe at this point that the Singularity Institute is likely to produce a working AGI. Many of the things it does are interesting, and I would consider donating to it for those reasons. But I think FAI is a vast distraction from much-needed AGI.

- samantha
