On 6/13/07, Eliezer S. Yudkowsky <[EMAIL PROTECTED]> wrote:
> I wouldn't bother working with anyone who was seriously worried over
> "who got the credit" for building a Singularity-class AI - no other
> kind matters.  There are two reasons for this, not just the obvious one.

Come on, there are no "obvious" reasons for this complex issue.  And
"Singularity-class" is not a well-defined term.

What I'm proposing is a way for *us* to collaborate and to bring about
progress faster.  I admit that my scheme is not perfect, but it seems that
no one else is proposing a better one.

If you're not proposing a better scheme for collaboration, and you criticize
my scheme in a non-constructive way, then effectively you're just saying
that you're not interested in collaborating at all.  And that's kind of sad,
given that we're still so far from AGI.

To put it another way, I'm not going to be the "boss" under this system of
collaboration.  I'm going to be just another contributor among many.
This is a scheme where we can work together without someone dominating
anyone else *a priori*.

That said, let me add that your non-profit model is also feasible -- charity
and altruism will always be welcome.  But the problem is that not everyone is
highly altruistic.  To organize average people to work together, you have to
give rewards.

YKY
