On 5/29/07, Samantha Atkins <[EMAIL PROTECTED]> wrote:


Without AI or such IA to be almost the same thing I don't have much reason
to believe humanity will see 3007.


*nods* Or rather - in my opinion - humanity probably will last that long
either way, but without it our chance of longer-term survival might have
been blown. Civilizations don't last forever, and it's an open question
whether we'll get a second shot if we muff the first one. (To take one of
the simplest subquestions: could industrial civilization be sustained or
restarted with the easily accessible fossil fuel deposits used up? I don't
know of any way to confidently answer that question in the affirmative.)

So we'd better not blow this first chance.

And you're right that we'll need a lot more intelligence than we have now;
our current hardware and software aren't enough. Given that an artificial
sentient mind, a _replacement_ for the human mind, isn't feasible this
century (even if you don't agree with that, perhaps you'll entertain the
idea that it's at least considerably more difficult than my alternative,
and that we should therefore concentrate our resources on the path that
offers the greater chance of success?), we need to focus on better,
smarter tools to _complement_ the human mind. So while I call what I'm
trying to do AI, you could also call it "such IA to be almost the same
thing".
