j.k. wrote:
On 12/20/2007 07:56 PM, Richard Loosemore wrote:
[snip]
The other significant mistake that people make is to think that it is
possible to speculate about how an AGI would function without first
having at least a reasonably clear idea about how minds in general are
supposed to function.  Why?  Because too often you hear comments like
"An AGI *would* probably do [x].....", when in fact the person speaking
knows so little about how minds (human or other) really work, that
all they can really say is "I have a vague hunch that maybe an AGI might
do [x], although I can't really say why it would...."
I do not mean to personally criticise anyone for their lack of
knowledge of minds, when I say this.  What I do criticise is the lack of
caution, as when someone says "it would" when they should say "there is
a chance that it might."
The problem is that 90% of everything said about AGIs on this list
falls into that trap.

I agree that there seems to be overconfidence in the inevitability of
things turning out the way it is hoped they will, and a lack of
appreciation for the unknowns and the unknown unknowns. This list is
hardly unique, though, in failing to recognize the contingent nature
of how things turn out.

-joseph

Actually, what I meant to target was not so much the unknown future as the specific technical details of thinking systems. A great deal is known about how thinking systems are structured, but people's unfamiliarity with that technical detail doesn't seem to stop them from making astonishingly detailed pronouncements about how future minds "would" behave.

Strange really: they probably would not be so quick to make such pronouncements about, say, quantum mechanics or planetary geology or the mating habits of drosophila, but when it comes to thinking systems, everyone's an expert without even trying.

Richard Loosemore

