The goal of evolution is to maximize population growth. Your goal is
immortality, because animals that fear death leave more offspring, even
though they die anyway.

On Tue, Jul 25, 2023, 4:50 AM <[email protected]> wrote:

> On Monday, July 24, 2023, at 11:05 PM, Matt Mahoney wrote:
>
> 4. Reproduction is a requirement for life, not for intelligence. Not
> unless you want to measure reproductive fitness as intelligence like
> evolution does. Is gray goo our goal?
>
>
> It takes the longest but is the most accurate measure of how smart we
> are. That is because the end goal is immortality and the persistence of
> some future machine, so if we really are smart and succeed, this score
> should be high. However, if brains are smart enough to discover knowledge
> or functions that make them score better on prediction tests but don't
> improve immortality, then technically they can be smarter without
> increasing the immortality test's score. *Do we really "need" brains to
> "know/waste energy on" things that don't improve lifespan or fleet size?
> We wouldn't actually want that.*
>
> GPT-4 can already earn 500 USD doing junior programming jobs, so the
> money test for AI is something we can run now; until recently there was
> no AI like GPT-4 that could generate solutions for paid work, only very
> narrow programs. One well-known figure (I forget who, the CEO of Google
> or someone like that) predicted that within two years an AI will bring in
> a million dollars after devising a research plan and selling something
> entirely on its own. *I imagine GPT-5 could be asked to "make a new
> programming language easier than C++ but better than C++ and easier than
> Python," and from just that prompt it would do every step on its own (a
> ton of little problems it can solve by itself), burn through nights and
> days tirelessly (possibly hours or even minutes in its fast brain), and
> hand back the whole piece of software.*
>
> Generation of senses and actions. We already know this is a fun and
> useful measure, but it is only for humans comparing themselves to AIs;
> AIs comparing themselves to other AIs they made no longer need it, since
> it is too subjective.
>
> Lastly, prediction score: it is not comparable to us, but to other AIs,
> and it is a very solid way to improve AIs. *However, it doesn't tell us
> in the long term whether the change really worked, or whether the output
> got better resolution or more coherent, etc. Other tests are needed for
> that.*
>
> I tend to feel that, as we go up the four paragraphs above, the
> measurement horizon gets longer, no? Money takes longer to measure: you
> need to do many, many steps and then sell the final product! So way 3
> takes longer, but not as long as way 4 at the top, which takes much
> longer to wait for. Prediction score is fast, though it doesn't seem to
> tell you much about what your AI actually produces. I'd imagine we
> currently rely mostly on that first, bottom way to build AI.
>
> Which test tests for AGI? Way 1 can't, unless Matt is really sure that
> humans can compress text at the 1 bpc he mentioned. Ways 2 and 3 already
> show we are close-ish to human level.
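>
> To make the 1 bpc target concrete, here is a minimal sketch of how a
> prediction score in bits per character can be computed (this toy uses an
> order-0 character model; a real test would use a context model's or an
> LLM's predicted probabilities, but the arithmetic is the same):
>
>     import math
>     from collections import Counter
>
>     def bits_per_character(text: str) -> float:
>         # Average -log2 p(c) under a unigram (order-0) character model
>         # fit to the text itself, i.e. its character-level entropy.
>         counts = Counter(text)
>         n = len(text)
>         return -sum((c / n) * math.log2(c / n) for c in counts.values())
>
>     sample = "the quick brown fox jumps over the lazy dog " * 20
>     print(f"{bits_per_character(sample):.2f} bits/char")  # roughly 4 bpc
>
> An order-0 model sits near 4 bits per character on English text; strong
> context models, and by Shannon's classic estimate humans, get down to
> roughly 1 bit per character, which is the bar the prediction test sets.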
>
> I have to think on this more, but for now: what should we really look
> for? What is the real thing we want? Not only do we want AI to work on
> AI, we also know that only smart humans can work on AI. So maybe the
> test should be: if we can make an AI that can work on AI, and have that
> AI use the evaluations I listed above, then it must be human-level AI,
> and that would also be exactly what we want to see happen, hehe :).

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T91a503beb3f94ef7-M447545b70a1fe0ee2a319caf
Delivery options: https://agi.topicbox.com/groups/agi/subscription
