Matt Mahoney wrote:
> --- Stan Nilsen <[EMAIL PROTECTED]> wrote:
>> Reprogramming humans doesn't appear to be an option.
>
> We do it all the time. It is called "school".
I might be tempted to call this "manipulation" rather than programming.
The results of schooling are unpredictable, whereas programming will
produce an expected result if the method is sound.
> Less commonly, the mentally ill are forced to take drugs or treatment "for
> their own good". Most notably, this includes drug addicts. Also, it is
> common practice to give hospital and nursing home patients tranquilizers to
> make less work for the staff.
>
> Note that the definition of "mentally ill" is subject to change. Alan Turing
> was required by court order to take female hormones to "cure" his
> homosexuality, and committed suicide shortly afterwards.
>> Reprogramming the AGI of the future might be possible IF the designers
>> build in the right mechanisms for effective oversight of the units.
> We only get to program the first generation of AGI. Programming subsequent
> generations will be up to their parents. They will be too complex for us to
> do it.
Is there a reason to believe that a fledgling AGI will be proficient
right from the start? It's easy to jump from AGI #1 to an AGI ten years
down the road and presume fantastic capabilities. Even if the AGI
can spend millions of cycles "ingesting" the Internet, won't it find
thousands of difficult problems that challenge it? Hard problems
don't just dissolve when you apply resources. The point here is that
control and domination of humans may not be very high on its priority list.
Do you think this "older" AGI will have an interest in trying to control
other AGIs that might come on the scene? I suspect that it will, and
it might see fit to design its offspring with an oversight interface.
In part, my contention is that AGIs will not automatically agree with one
another - do smart people necessarily come to the same opinion? Or does
the existence of AGI mean there are no longer "opinions", only facts, since
these units grasp everything correctly?
Science fiction aside, there may be a slow transition of AGI into
society - remember that the G in AGI means general, not born with "stock
market manipulation" capability (unless it mimics the General
population, in which case, good luck).
> -- Matt Mahoney, [EMAIL PROTECTED]
-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: http://www.listbox.com/member/?&
Powered by Listbox: http://www.listbox.com