> <https://agi.topicbox.com/groups/agi/Te8aae875ccd49383-M79593194c7bd98911f81f4a4>
The alignment problem has to address two threats: AI controlled by people
and AI not controlled by people. Most of our attention has been on the
second type, even though it is a century away at the current rate of Moore's
law. Self-replicating nanotechnology will become a threat when its computing
capacity exceeds that of DNA-based life. There is room for that to happen
because plants currently capture only about 0.3% of available sunlight
(90,000 terawatts) as carbohydrates (210 billion tons of carbon per year, or
20% of the biosphere, at 4 kcal/g), while solar panels already achieve
20-30% efficiency.
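A quick sanity check of the 0.3% figure in Python. This sketch assumes the
4 kcal/g applies to carbohydrate mass rather than carbon mass, and that
carbohydrate (CH2O)n is roughly 40% carbon by weight; both are my
assumptions, not stated in the original numbers.

```python
# Back-of-the-envelope check of photosynthesis efficiency.
# Assumptions: 4 kcal/g refers to carbohydrate, which is ~40% carbon by mass.
carbon_g_per_year = 210e9 * 1e6            # 210 billion metric tons of carbon, in grams
carbohydrate_g = carbon_g_per_year / 0.40  # back out total carbohydrate mass
energy_j = carbohydrate_g * 4 * 4184       # 4 kcal/g, 4184 J per kcal
seconds_per_year = 365.25 * 24 * 3600
power_w = energy_j / seconds_per_year      # average power fixed by plants
sunlight_w = 90_000e12                     # 90,000 TW of available sunlight
print(f"{power_w / 1e12:.0f} TW captured, "
      f"{100 * power_w / sunlight_w:.2f}% of available sunlight")
```

Under those assumptions the result comes out near 280 TW, or about 0.3% of
incident sunlight, consistent with the figure above.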

Assuming that global computing capacity doubles every 2 years, it will take
about a century for the current 10^24 bits of storage capacity to match the
10^37 bits stored in all the world's DNA. We are also far below biology's
computing power of 10^29 DNA copy operations and 10^31 amino acid
translation operations per second.
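The century estimate follows directly from the doubling assumption; a
minimal check:

```python
import math

# Years for storage to grow from 10^24 bits to 10^37 bits,
# assuming one doubling every 2 years.
doublings = math.log2(1e37 / 1e24)  # ~43 doublings to close a 10^13 gap
years = 2 * doublings
print(f"{years:.0f} years")         # roughly a century
```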

The kind of AI that we need to worry about now is the kind that gives us
everything we want, or at least everything that the owners of the AI want.
When your work no longer has value because machines can do it better, your
only sources of income will be the AI that you own, your personal
information (for training AI), and government assistance. Your personal
information has value only in proportion to your buying power, which widens
the power-law distribution of wealth that is necessary to make an economy
work. It takes money to make money.

Income redistribution through taxes and benefits solves only part of the
problem. When you don't need other people, they don't need you either, or
know or care that you exist. When it is easier, safer, and more convenient
to live alone in our private virtual worlds, we stop having children and
lose our ability to communicate with other people even if we wanted to. In
the short term we are evolving toward a mostly African and Muslim
population, and in the longer term toward a population that rejects
technology, birth control, and women's rights, provided we don't go extinct
first. That will slow down Moore's law before we have to worry about the
other type of AI.





------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Te8aae875ccd49383-M71b3c193d1ee58acd4bef862
Delivery options: https://agi.topicbox.com/groups/agi/subscription
