On Tue, Sep 26, 2023 at 3:27 PM Mike Archbold <[email protected]> wrote:
> A lot of these people that are suddenly piling on the bandwagon to AGI --
> if they say it doesn't take a lot of compute, not going to use LLMs, where
> were they the last 20+ years?

Wow... how many layers must I slice through here?

1) Who is "suddenly piling onto AGI" saying it doesn't take "a lot" of compute?

2) "A lot of compute" must be compared to the current "warehouses" of compute that Carmack says he doesn't think are necessary. Indeed, he thinks you have to be wealthy to "take a stab at" the current opportunity, but that the *exponential* increase in hardware cost-effectiveness will eventually bring the cost down to grad-student level, at which point the opportunity will be gone for the wealthy.

3) Are you perhaps thinking of Hutter's strict limit on compute for his prize, as though he is someone who is "suddenly piling on the bandwagon to AGI"? All of a sudden, like, over 20 years ago? And is Hutter even making that claim? No. He's simply saying that "The Hardware Lottery" may be misdirecting *research* in machine learning. He has never stated that Matt's Large Text Compression Benchmark, which is unconstrained in compute, is a bad direction for the *development* of a language model -- however profligate in compute it may be.

4) Where was Carmack for the last 20+ years? Making money, so that he is in a position to make this comment:

"I'm honestly surprised that there aren't more people ... I mean there are hundreds of people, like me, that were technical people that, you know, that succeeded, sold companies, that have the resources to go and apply this, and I'm kind of surprised that there aren't more of them taking small stabs at it... the expected value of that, if you have no fear of ruin, is quite significant."

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/Td500c6985165fa72-Mbb902ed3b3a9454a6c6dda8c
Delivery options: https://agi.topicbox.com/groups/agi/subscription
