9. a particular AGI theory. That is, one that convinces me it's on the right
track.
Now that you have run this poll, what did you learn from the responses and how
are you using this information in your effort?
-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe
Hey but it makes for an excellent quote. Facts don't have to be true if they're
beautiful or funny! ;-)
Sorry Eliezer, but the more famous you become, the more these types of
apocryphal facts will surface... most not even vaguely true... You should be
proud and happy! To quote Mr Bean 'Well, I
Hm. Memory may be tricking me.
I did a deeper scan of my mind, and found that the only memory I
actually have is that someone at the conference said that they saw I
wasn't in the room that morning, and then looked around to see if
there was a bomb.
I have no memory of the fire thing one
This absolutely never happened. I absolutely do not say such things, even
as a joke.
Your recollection is *very* different from mine. My recollection is
that you certainly did say it as a joke but that I was *rather* surprised
that you would say such a thing even as a joke. If anyone
# 7 8 9
Money is good, but the overall AGI theory and program plan is the most
important aspect.
James Ratcliff
YKY (Yan King Yin) [EMAIL PROTECTED] wrote: Can people rate the following
things?
1. quick $$, ie salary
2. long-term $$, ie shares in a successful corp
3. freedom to do what
I did a deeper scan of my mind, and found that the only memory I actually
have is that someone at the conference said that they saw I wasn't in the
room that morning, and then looked around to see if there was a bomb.
My memory probably was incorrect in terms of substituting fire for bomb.
Synergy or win-win between my work and the project, i.e. if the project
dovetails with what I am doing (or has a better approach). This would require
some overlap between the project's architecture and mine. It would also
require a clear vision and explicit 'clues' about deliverables/modules,
provided that I thought they weren't just going to take my code and apply
some licence which meant I could no longer use it in the future.
I suspect that I wasn't clear about this . . . . You can always take what is
truly your code and do anything you want with it . . . . The problems
start
but I'm not very convinced that the singularity *will* automatically happen.
IMHO, the nature of intelligence implies it is not amenable to
simple linear scaling - likely not even log-linear
I share that guess/semi-informed opinion; however, while that means that I
am less
Mark Waser writes:
P.S. You missed the time where Eliezer said at Ben's
AGI conference that he would sneak out the door before
warning others that the room was on fire :-)
You people making public progress toward AGI are very brave indeed! I wonder
if a time will come when the
On 04/06/07, Derek Zahn [EMAIL PROTECTED] wrote:
I wonder if a time will come when the personal security of AGI researchers or
conferences will be a real concern. Stopping AGI could be a high priority
for existential-risk wingnuts.
I think this is the view put forward by Hugo De Garis. I
On 6/5/07, Bob Mottram [EMAIL PROTECTED] wrote:
I think this is the view put forward by Hugo De Garis. I used to
regard his views as little more than an amusing sci-fi plot, but more
recently I am slowly coming around to the view that there could emerge
a rift between those who want to build
Mark Waser wrote:
P.S. You missed the time where Eliezer said at Ben's AGI conference
that he would sneak out the door before warning others that the room was
on fire :-)
This absolutely never happened. I absolutely do not say such things,
even as a joke, because I understand the
Clues. Plural.
--
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
The most important thing by far is having an AGI design that seems
feasible.
Only after that (very difficult) requirement is met, do any of the others
matter.
-- Ben G
On 6/3/07, YKY (Yan King Yin) [EMAIL PROTECTED] wrote:
Can people rate the following things?
1. quick $$, ie salary
2.
important -- 6, which would necessarily include 8 and 9
potentially important -- 10 (average level is a poor gauge; if there are
sufficient highly-expert/superstar people, you can afford an equal number of
relatively non-expert people; if you don't have any real superstars, you're
dead in the
My reasons for joining a2i2 could only be expressed as subportions of
6 and 8 (possibly 9 and 4).
I joined largely on the strength of my impression of Peter. My
interest in employment was to work as closely as possible on general
artificial intelligence, and he wanted me to work for him on
--- YKY (Yan King Yin) [EMAIL PROTECTED] wrote:
Can people rate the following things?
1. quick $$, ie salary
2. long-term $$, ie shares in a successful corp
3. freedom to do what they want
4. fairness
5. friendly friends
6. the project looks like a winner overall
7. knowing that the