"People are naturally attracted to the idea that 'first something is
expensive, then it gets cheaper' — as if AI is a single thing of constant
quality, and when it gets cheaper, we'll use fewer chips to train it. But
what's important is the *scaling curve*: when it shifts, we simply traverse
it faster, because the value of what's at the end of the curve is so high.
[...]  Making AI that is smarter than almost all humans at almost all
things will require millions of chips, tens of billions of dollars (at
least), and is most likely to happen in 2026-2027. [...] Even if the US and
China were at parity in AI systems, it seems likely that China could direct
more talent, capital, and focus to military applications of the technology.
Combined with its large industrial base and military-strategic advantages,
this could help China take a commanding lead on the global stage, not just
for AI but for everything. If China can't get millions of chips, we'll (at
least temporarily) live in a unipolar world, where only the US and its
allies have these models. It's unclear whether the unipolar world will
last, but there's at least the possibility that, *because AI systems can
eventually help make even smarter AI systems, a temporary lead could be
parlayed into a durable advantage."*

*On DeepSeek and Export Controls
<https://darioamodei.com/on-deepseek-and-export-controls>*

*The above was written by Dario Amodei, CEO of Anthropic.*
===========

*Are we close to an intelligence explosion?*
<https://futureoflife.org/ai/are-we-close-to-an-intelligence-explosion/>

*From the above:*

"We have every reason to expect that AI systems will eventually surpass
human level at every cognitive task. One such task is AI research itself.
This is why many have speculated that AIs will eventually enter a phase of
recursive self-improvement."

*John K Clark    See what's on my new list at  Extropolis
<https://groups.google.com/g/extropolis>*


-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.