On Tue, Jun 16, 2015 at 10:32 AM, Logan Streondj <[email protected]> wrote:
> At which point, we'll have to rather seriously consider
> human-level AI and beyond as deserving of human rights.

It concerns me that this is even an issue. Granting human rights to
superior machines that lack human constraints will almost certainly
lead to human extinction. We are designing AGI as tools, as we should,
to do work that we would otherwise have to pay people to do.

> some of the "jobs" on the "least likely to be automated" list
> seem to me to be much easier to automate now.

The most difficult job to automate will be testing experimental drugs
and medical procedures. Modeling the chemistry of the human body is
much harder than modeling human behavior. We cannot even model simple
chemistry, like calculating the freezing point of water, because it
requires solving Schrödinger's equation, which has exponential time
complexity in the number of particles unless the computation is
performed on a quantum computer. Even a quantum computer cannot do the
computation faster than the system it is simulating. It is faster to
do the actual experiment.
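To make the exponential blowup concrete, here is a back-of-the-envelope
sketch (mine, not from the original discussion): the state vector of n
two-level quantum particles has 2^n complex amplitudes, so even storing
an exact state, let alone evolving it, quickly exceeds any classical
machine. The 16-bytes-per-amplitude figure assumes one complex double
per amplitude.

```python
def state_vector_bytes(n_particles: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to store the exact quantum state of n two-level
    particles, assuming one complex double (16 bytes) per amplitude."""
    return (2 ** n_particles) * bytes_per_amplitude

print(state_vector_bytes(10))   # 16384 bytes -- trivial
print(state_vector_bytes(50))   # ~18 petabytes -- beyond any single machine
print(state_vector_bytes(300))  # more bytes than atoms in the observable universe
```

Even 50 particles is hopeless classically, and a glass of water has on
the order of 10^24 molecules, which is why running the actual experiment
beats simulating it.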

-- 
-- Matt Mahoney, [email protected]


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now