Re: [agi] The Grounding of Maths

2007-10-12 Thread Eliezer S. Yudkowsky
thinking about such things... Just visualize it in N-dimensional space, then let N go to 8.

Re: Self-improvement is not a special case (was Re: [agi] Religion-free technical content)

2007-10-12 Thread Eliezer S. Yudkowsky
of yourself, but *either one* involves the sort of issues I've been calling reflective.

Re: [agi] Nirvana? Manyana? Never!

2007-11-01 Thread Eliezer S. Yudkowsky
in which you would most want to live?

Re: [agi] Nirvana? Manyana? Never!

2007-11-01 Thread Eliezer S. Yudkowsky
to be, wants to aspire to? Forget, for the moment, what you think is possible - if you could have anything you wanted, is this the end you would wish for yourself, more than anything else?

Re: [agi] Nirvana? Manyana? Never!

2007-11-02 Thread Eliezer S. Yudkowsky
anime.

Re: [agi] Nirvana? Manyana? Never!

2007-11-02 Thread Eliezer S. Yudkowsky
theists, there was an atheist police officer, signed up for cryonics, who ran into the World Trade Center and died on September 11th. As Tyrone Pow once observed, for an atheist to sacrifice their life is a very profound gesture.

Re: [agi] Questions

2007-11-05 Thread Eliezer S. Yudkowsky
Monika Krishan wrote: 2. Would it be a worthwhile exercise to explore what Human General Intelligence, in its present state, is capable of? Nah.

[agi] Re: What best evidence for fast AI?

2007-11-10 Thread Eliezer S. Yudkowsky
- let alone that we can predict it in practice with knowledge presently available to us.

[agi] OpenMind, MindPixel founders both commit suicide

2008-01-19 Thread Eliezer S. Yudkowsky
http://www.wired.com/techbiz/people/magazine/16-02/ff_aimystery?currentPage=all I guess the moral here is "Stay away from attempts to hand-program a database of common-sense assertions."

Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-20 Thread Eliezer S. Yudkowsky
many great mathematicians and achievers who lived to old age. I dare not say whether it is dangerous to be a genius without access to more complete statistics. -- Kai-Mikael Jää-Aro - http://www.nada.kth.se/~kai/lectures/geb.html
