On Thu, Apr 24, 2008 at 3:16 AM, Samantha Atkins [EMAIL PROTECTED] wrote:
Thomas McCabe wrote:
Does NASA have a coherent response to the moon hoax theory?
This is completely uncalled for. No particular theory of AGI at this time
deserves to be compared to the moon hoax conspiracy.
Thomas,
The argument I presented is *not* a restatement of Rice's theorem,
because the concept of the size of a scientific theory is not something
that maps onto the parallel concept of the size of a function,
algorithm, or program.
In order to map theory-size onto algorithm-size it would
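The Rice's theorem point in the snippet above can be made concrete with the standard textbook reduction: if any non-trivial semantic property of programs were decidable, we could decide the halting problem. This is a minimal sketch, not anything from the original thread; `has_property`, `witness`, and `build_halting_decider` are hypothetical names, and `has_property` is a decider that cannot actually exist (which is the point of the theorem).

```python
def build_halting_decider(has_property, witness):
    """Standard Rice's theorem reduction (sketch).

    `has_property(prog)` is a hypothetical decider for some non-trivial
    semantic property P of programs; `witness` is any program known to
    satisfy P. From these we construct a halting decider, which is
    impossible -- hence no such `has_property` can exist.
    """
    def halts(prog, inp):
        # The gadget behaves exactly like `witness` if prog halts on inp,
        # and diverges (so trivially lacks P, for suitable P) otherwise.
        def gadget(x):
            prog(inp)          # runs forever iff prog does not halt on inp
            return witness(x)  # reached only if prog halts
        return has_property(gadget)
    return halts
```

For halting inputs the wiring can be checked directly with a mock decider that simply runs the gadget (on non-halting inputs the mock would hang, as the theorem demands):

```python
witness = lambda x: x + 1
prog = lambda inp: None                        # halts immediately
mock_has_property = lambda g: g(41) == 42      # only safe for halting progs
halts = build_halting_decider(mock_has_property, witness)
assert halts(prog, 0) is True
```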
On Sat, Apr 26, 2008 at 7:48 AM, Thomas McCabe [EMAIL PROTECTED]
wrote:
On Thu, Apr 24, 2008 at 3:16 AM, Samantha Atkins [EMAIL PROTECTED] wrote:
Thomas McCabe wrote:
Popularity is irrelevant.
Popularity, of course, is not the ultimate judge of accuracy. But are
you seriously claiming
Rolf Nelson wrote:
what is this distinction you are making between logistics and direct
solutions? Especially given that there is much debate about how to
implement friendliness.
By logistics, I mean trying to get talented and motivated people
working on the problem in ways that match their
On 4/18/08, Richard Loosemore [EMAIL PROTECTED] wrote:
Such a discussion list would be just another exclusive club for people
dedicated to spineless Yudkowsky-worship.
Richard Loosemore
Thomas McCabe wrote:
On 4/18/08, Richard Loosemore [EMAIL PROTECTED] wrote:
Such a discussion list would be just another exclusive club for people
dedicated to spineless Yudkowsky-worship.
Richard Loosemore
Eli's not a member of fai-logistics, and I don't think he even knows
about it yet.
On 4/18/08, Richard Loosemore [EMAIL PROTECTED] wrote:
Thomas McCabe wrote:
On 4/18/08, Richard Loosemore [EMAIL PROTECTED] wrote:
Such a discussion list would be just another exclusive club for people
dedicated to spineless Yudkowsky-worship.
Richard Loosemore
Eli's
Thomas McCabe wrote:
On 4/18/08, Richard Loosemore [EMAIL PROTECTED] wrote:
Thomas McCabe wrote:
On 4/18/08, Richard Loosemore [EMAIL PROTECTED] wrote:
Such a discussion list would be just another exclusive club for people
dedicated to spineless Yudkowsky-worship.
Richard Loosemore
On 4/18/08, Richard Loosemore [EMAIL PROTECTED] wrote:
You repeatedly insinuate, in your comments above, that the idea is not
taken seriously by anyone, in spite of the fact I have already made it quite
clear that this is false.
Richard Loosemore
Thomas McCabe wrote:
On 4/18/08, Richard Loosemore [EMAIL PROTECTED] wrote:
You repeatedly insinuate, in your comments above, that the idea is not
taken seriously by anyone, in spite of the fact I have already made it quite
clear that this is false.
The burden of proof is on you to show
I request to be admitted to the proposed list for purely journalistic and
cultural observation reasons.
My background in AI is limited to some expert systems work in the late '80s and
also three years of work with a linguistic start-up working on the problem of
natural language Machine
Hi Rolf,
I'm interested. We talked briefly, and I've worked with SIAI and
Eliezer previously. I would state my pre-Singularity purpose is to
help solve the FAI implementation problem. At the moment I am trying
to understand AI (with questions of Friendliness never far from my
mind), starting a
On Thu, Apr 17, 2008 at 10:00 PM, Rolf Nelson [EMAIL PROTECTED] wrote:
Announcement: fai-logistics is a new archived discussion group to
discuss the most effective ways of aiding the development of Friendly
AI. It is not a discussion group for any other aspects of AGI.
Membership criteria: