Oh, I don't know... couldn't Drexler's "Gray Goo" be modified to
incorporate Ivan's DNA and be done with it?

On Thu, Jul 13, 2023 at 12:58 PM Matt Mahoney <[email protected]>
wrote:

> On Thu, Jul 13, 2023 at 6:21 AM <[email protected]> wrote:
> >
> > On Thursday, July 13, 2023, at 4:28 AM, Matt Mahoney wrote:
> >
> > Organizing disorganized thoughts begins with a goal. Why build AGI? We
> can group the goals into roughly 4 categories.
> >
> > 1. Scientific curiosity, understanding the brain and consciousness.
> > 2. Automating labor.
> > 3. Uploading, immortality.
> > 4. World domination, launching a singularity, creating utopia.
> >
> > 5. Creating an artificial descendant
> 
> Nice one. Yudkowsky was right. AI will kill us all and we won't even
> put up a fight. But probably not in this century.
> 
> Let's put a timeline on this. Assume Moore's Law continues doubling
> global computing power every 2 years. This is uncertain because clock
> speeds stalled at 2-3 GHz in 2010 and transistor sizes are likely to
> stall this decade because we are close to the ~5 nm spacing limit
> between dopant atoms in silicon. A RAM capacitor stores a bit using 8
> electrons. Further advances will require nanotechnology, moving atoms
> instead of electrons, to solve the power problem. In about 60-70 years
> we will stall at the Landauer limit of kT ln 2, about 3 zJ per bit
> operation at room temperature, still a ~10^9 improvement over
> transistors.
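As a sanity check on the 60-70 year figure, here is a back-of-envelope sketch in Python. The ~1 pJ per bit operation for today's hardware is my assumption for illustration, not a figure from the post:

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # room temperature, K

# Landauer limit: minimum energy to erase one bit, kT ln 2
landauer_J = k_B * T * math.log(2)    # ~2.9e-21 J, i.e. ~3 zJ

# Assumed energy per bit operation in today's hardware: ~1 pJ
# (an illustrative round number, not from the post)
current_J = 1e-12
improvement = current_J / landauer_J  # ~3e8, order 10^9

# Doublings at a 2-year Moore's Law cadence to close that gap
years = 2 * math.log2(improvement)    # roughly 60 years
print(f"{landauer_J:.2e} J/bit, {improvement:.1e}x headroom, ~{years:.0f} years")
```

With a 1 pJ starting point this lands near the low end of the 60-70 year range; a few pJ per operation pushes it toward the high end.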
> 
> 1. The AGI algorithm is mostly understood. LLMs pass the Turing test.
> We understand how neural networks succeeded where symbolic processing
> failed. Language evolved to be efficiently learnable one layer at a
> time in the order of phonemes, word segmentation, semantics, and
> grammar. Symbolic models, like those used for compilers, failed
> because they put grammar before semantics (e.g. how to parse "I ate
> pizza with Bob/olives/a fork"). Deep neural networks like
> transformers can learn arbitrarily deep hierarchical concepts like
> mathematics and world models of physics and social interaction. Human
> knowledge is half inherited and half learned (about 10^9 bits each),
> but an LLM can learn the inherited part, like human emotions, from a
> sufficiently large unlabeled corpus. It knows how to model feelings
> without having feelings. It knows that it is an LLM. It is
> self aware without being conscious, in the sense that it understands
> how humans have an irrefutable sense of being conscious (as part of
> our evolved fear of death), without having this sense itself.
> 
> 2. Automating labor requires more than language. Vision and robotics
> are advancing but not at human level yet. It will take about 30 years
> to reduce the cost of producing a movie from $1M to $10. The present
> value of labor is roughly world GDP divided by the interest rate, about
> $1 quadrillion. We
> should expect investment on this scale. Modeling 10^10 human brain
> sized neural networks will require 10^26 OPS, 10^25 parameters, and
> 10^17 bits of human knowledge collected no faster than 5-10 bits per
> second per person at a cost of > $100 trillion. This is slow enough
> for humans to adapt to the changing job market without massive
> unemployment. Instead, AI will make us more productive, improve our
> lives both at work and home, and increase our income. But the big
> change is we will have little need or desire to interact with other
> humans because AI will be far more helpful. You can have everything
> you want, but this is not where happiness comes from. A state of
> maximum utility is static, without feeling.
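The $1 quadrillion and 10^26 OPS figures above can be reproduced with a quick perpetuity calculation. The GDP, labor-share, discount-rate, and per-brain throughput numbers below are assumed round figures for illustration:

```python
# Value labor as a perpetuity: present value = annual payment / discount rate.
world_gdp = 100e12   # ~$100 trillion/year world GDP (assumed round figure)
labor_share = 0.5    # fraction of GDP paid to labor (assumption)
discount = 0.05      # 5% interest rate (assumption)
value_of_labor = world_gdp * labor_share / discount  # ~$1 quadrillion

# Hardware to model everyone: 10^10 brains at an assumed
# 10^16 synaptic operations per second each.
total_ops = 1e10 * 1e16  # 10^26 OPS, matching the post's estimate
print(f"${value_of_labor:.0e}, {total_ops:.0e} OPS")
```

Halving the discount rate doubles the valuation, so "about $1 quadrillion" is the right order of magnitude rather than a precise figure.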
> 
> 3. We already have the technology and enough personal data to
> construct an LLM that claims to be you, happily living in a virtual
> utopia. All that remains is to construct a world where nobody else
> knows or cares that you exist in a human body.
> 
> 4 and 5. To transform the world, technology has to catch up to
> biology. The biosphere has 10^37 bits of DNA storage and executes
> 10^29 DNA copy and 10^31 amino acid transcription operations per
> second. Human evolution was the result of 10^48 operations over the
> last 10^17 seconds (3 billion years). Photosynthesis generates food at
> a rate of 500 TW, out of 90,000 TW available solar power. We already
> have solar panels that are 30% efficient. Global computing power is
> now about 10^19 OPS and 10^26 bits. At the current rate of Moore's Law
> and with intelligent design, our self-replicating, non-DNA-based
> descendants will be ready to displace DNA-based life around 2100.
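The 2100 date follows from the gap between the post's two throughput figures and the assumed 2-year doubling time:

```python
import math

# Doublings for global computing (~10^19 OPS today, per the post) to match
# the biosphere's ~10^31 amino acid transcription operations per second.
compute_ops = 1e19
biosphere_ops = 1e31

doublings = math.log2(biosphere_ops / compute_ops)  # log2(10^12) ~ 40
years = 2 * doublings                               # ~80 years from 2023
print(f"{doublings:.1f} doublings, ~{years:.0f} years")
```

Forty doublings from 2023 gives roughly 2100, which is why the estimate is insensitive to the exact starting figures: being off by 100x either way shifts the date by only about 13 years.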
> 
> --
> -- Matt Mahoney, [email protected]

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T59f369fee7febd6d-M19485ddce6a542f25f2caf46
Delivery options: https://agi.topicbox.com/groups/agi/subscription
