
On 10/24/06, Hank Conn <[EMAIL PROTECTED]> wrote:
> About de Garis... I feel the same as others on this list have
> expressed... in that he is definitely loony.
>
> I also have very strong doubts about Kurzweil's model of how the
> Singularity is going to unfold.

I know Hugo de Garis pretty well personally, and I can tell you that
he is certainly not "loony" on a personal level, as a human being.
He's a bit eccentric, but he actually has a very solid understanding
of the everyday world as well as of many branches of science....  His
"reality discrimination" faculty is excellent, which distinguishes him
from the genuine loonies of the world...  He is, however, a bit of a
showman -- it can be striking how his persona changes when he shifts
from private conversation to speaking on-camera or in front of a crowd...

Regarding his prognostications in his book The Artilect War, I don't
agree with the confidence with which he puts them forth; but then, I
also think Ray Kurzweil is sometimes highly overconfident regarding
his predictions.

Both de Garis and Kurzweil share an intuition that the Singularity
will arise via a "soft takeoff" scenario.  They each argue for a
different possible consequence of the soft takeoff:

-- Kurzweil: ultratech becomes part of all of our lives, so much so
that we take it for granted, and the transition from the human to
posthuman era is seamless

-- De Garis: ultratech polarizes society, with some humans embracing
it and others rejecting it, and a massive war ensues

So far as I can tell, conditional on the hypothesis of a soft takeoff,
both outcomes are plausible, and I don't know how to estimate the
odds of either one.

Further, both de Garis and Kurzweil argue that AGI is likely to be
achieved first through human brain emulation.  [De Garis is actively
working to help with this, by creating firmware platforms for
large-scale neural net emulation; whereas Kurzweil, so far as I know,
is not actively involved in research in this area right now.]

Equally interesting is the debate between soft and hard takeoff; but
that debate is not raised by the de Garis vs. Kurzweil contrast, as
the two of them agree on this point.

-- Ben G

> Here's what Eliezer had to say 4 years ago about Kurzweil... (I
> imagine this is horribly obsolete in many ways, like everything else,
> especially given that Kurzweil donated something like 15 grand to
> SIAI a while back.)
> http://www.sl4.org/archive/0206/4015.html
>
> I also think that if you are expecting the Singularity in 2029 or
> later, you might be in for quite an early surprise.
>
> Ugh... the poll on the website says "Whose vision do you believe:
> Kurzweil or de Garis?" ... lol
>
> Interesting news, though.

> On 10/24/06, Ben Goertzel <[EMAIL PROTECTED]> wrote:
> >
> > http://www.bbc.co.uk/sn/tvradio/programmes/horizon/broadband/tx/singularity/
> >
> > Tuesday 24 October 2006, 9pm on BBC Two
> >
> > "Meet the scientific prophets who claim we are on the verge of
> > creating a new type of human - a human v2.0.
> >
> > "It's predicted that by 2029 computer intelligence will equal the
> > power of the human brain. Some believe this will revolutionise
> > humanity - we will be able to download our minds to computers,
> > extending our lives indefinitely. Others fear this will lead to
> > oblivion by giving rise to destructive ultra-intelligent machines.
> >
> > "One thing they all agree on is that the coming of this moment -
> > and whatever it brings - is inevitable."