Re: [agi] FHI is shutting down

2024-04-21 Thread Alan Grimes via AGI
Matt Mahoney wrote: Maybe because philosophy isn't real science, and Oxford decided FHI's funding would be better spent elsewhere. You could argue that the existential risk of human extinction is important, but browsing their list of papers doesn't give me a good feeling that they have…

Re: [agi] α, αGproton, Combinatorial Hierarchy, Computational Irreducibility and other things that just don't matter to reaching AGI

2024-04-21 Thread John Rose
If the fine structure constant were tunable across different hypothetical universes, how would that affect the overall intelligence of each universe? Dive into that rabbit hole: express and/or algorithmicize the intelligence of a universe. There are several potential ways to do that, some of…
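For context on the constant being discussed: the fine-structure constant α is dimensionless, defined as α = e²/(4πε₀ħc), and in our universe takes the value ≈ 1/137.036. A minimal sketch computing it from the CODATA values (the thought experiment above treats α as a free parameter; this just shows the baseline):

```python
import math

# CODATA 2018 values (exact or recommended)
e = 1.602176634e-19       # elementary charge, C (exact)
eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
hbar = 1.054571817e-34    # reduced Planck constant, J*s
c = 299792458             # speed of light, m/s (exact)

# Fine-structure constant: alpha = e^2 / (4 * pi * eps0 * hbar * c)
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)

print(alpha)      # ~0.0072973525...
print(1 / alpha)  # ~137.036
```

Varying α even slightly changes atomic binding energies and chemistry, which is why it features in fine-tuning arguments like the one above.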