> AGI does not need promoting.

Considering
a) how important AGI is
b) how many dev teams seriously work on AGI
c) how many investors are willing to spend good money on AGI R&D
I believe AGI does need promoting. And IMO it's similar to the
immortality research some of the Novamente folks are involved in. It's
just unbelievable how much money (and other resources) is being spent
on all kinds of nonsensical/insignificant projects worldwide. I wish
every American gave just $1 for AGI and $1 for immortality research.
Imagine what this money could do for all of us (if used wisely).
Unfortunately, people would rather spend the money on popcorn at
the cinema.

> We should be more concerned about the risks of AGI.  When humans can make
> machines smarter than themselves, then so can those machines.  The result will
> be an intelligence explosion.  http://mindstalk.net/vinge/vinge-sing.html

I'll check your links later, but generally speaking, we can avoid many
risks by controlling the AGI's goal system - which should not be that
difficult if the AGI is well designed.

> The problem is that humans cannot predict -- and therefore cannot control --
> machines that are vastly smarter.

"cannot predict" - I agree.
"cannot control" - I disagree. Controlling goals, subgoals, and the
real world impact (possibly using independent narrow AI tools) will do
the trick.

>  Could your consciousness exist in a machine with different goals or different
> memories?

IMO no.

> Do you become the godlike intelligence that replaces the human
> race?

Godlike intelligence? :) Ok, here is how I see it: If we survive, I
believe we will eventually get plugged into some sort of pleasure
machine and will not care about intelligence at all. Intelligence is a
useless tool when there are no problems and no goals to think about.
We don't really want any goals/problems in our minds. Basically, the
goal is to have no goal(s) and to safely experience pleasure as
intense as the available design allows for as long as possible. AGI
could eventually be tasked with taking care of whatever that requires,
plus searching for system improvements and for things that an altered
human mind might consider even better than feelings as we know them
now.

Many might think that they love someone so much that they would never
say "bye" and get plugged into a pleasure machine, but I'm pretty sure
they would change their mind after the first trial of a well-designed
device of that kind. That's how I currently see the best possible
future.

Some people, when talking about advanced aliens, ask "Where are
they?" Possibly, they are in such a pleasure machine and don't really
care about anything, feeling like true gods in a world where concepts
like intelligence are totally meaningless.

Regards,
Jiri Jelinek

-----
This list is sponsored by AGIRI: http://www.agiri.org/email