> ED>> So is the envisioned world one in which people are on something
> equivalent to a perpetual heroin or crystal meth rush?

Kind of, except it would be safe.

> If so, since most current humans wouldn't have much use for such people, I
> don't know why self-respecting productive human-level AGIs would either.

It wouldn't be supposed to think that way. It just does what it's
tasked to do (no matter how smart it is).

> And, if humans had no goals or never thought about intelligence or problems,
> there is no hope they would ever be able to defend themselves from the
> machines.

Our machines would work for us and do everything much better, so there
would be no reason for us to do anything.

> I think it is important to keep people in the loop and substantially in
> control for as long as possible,

My initial thought was the same, but if we have narrow-AI safety tools
doing a better job in that area for a *very*, *very* long time, we
will become convinced that there is simply no need for us to be
directly involved.

> at least until we make a transhumanist transition.
> I think it is important that most people have some sort of
> work, even if it is only in helping raise children, taking care of the old,
> governing society, and managing machines.

My thought was about a very distant [potential] future. The world will
change drastically. There will be no [desire for] children and no
"old" (we will live "forever"). Our cells are currently programmed to
die; that code will be rewritten if we stick with cells. The meaning
of the term "society" will change, and at a certain stage we will IMO
not care about any concept you can name today. But we had better spend
more time trying to figure out how to design the first powerful AGI at
this stage, plus how to keep extending our lives so WE can make it to
those fairy-tale future worlds.

> Freud said work of some sort was
> important, and a lot of people think he was right.

It will remain valid for a while :-)

> Even as humans increasingly become more machine through intelligence
> augmentation, we will have problems.  Even if the machines totally take over
> they will have problems.  Shit happens -- even to machines.

Right, but they will be better shit-fighters.

> So I think having more pleasure is good, but trying to have so much pleasure
> that you have no goals, no concern for intelligence, and never think of
> problems is a recipe for certain extinction.

Let's go to an extreme: imagine being an immortal idiot. No matter
what you do and how hard you try, the others will always be so much
better at everything that you will eventually become totally
discouraged, or even afraid to "touch" anything, because it would just
always demonstrate your relative stupidity (/limitations) in some way.
What a life. Suddenly, there is this amazing pleasure machine offering
a new god-like style of living for poor creatures like you. What do
you do?

Regards,
Jiri Jelinek


> You know, survival of the
> fittest and all that other boring rot that just happens to dominate reality.
>
> Nirvana? Manyana? Never!
>
> Of course, all this is IMHO.
>
> Ed Porter
>
> P.S. If you ever make one of your groove machines, you could make billions
> with it.

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=8660244&id_secret=60220603-cef30c
