Maybe because philosophy isn't real science, and Oxford decided FHI's
funding would be better spent elsewhere. You could argue that the
existential risk of human extinction is important, but browsing their list
of papers doesn't give me confidence that they have produced anything
important besides talk. What hypotheses have they tested?

Is MIRI next? It seems like they are just getting in the way of progress
and hurting the profits of their high-tech billionaire backers.

Where are the predictions of population collapse because people are
spending more time on their phones instead of making babies?

On Sat, Apr 20, 2024, 1:27 PM James Bowery <[email protected]> wrote:

> Is there quasi-journalistic synopsis of what happened to cause it to
> receive "headwinds"?  Is "Facebook" involved or just "some people on"
> Facebook?  And what was their motivation -- sans identity?
>
> On Fri, Apr 19, 2024 at 6:28 PM Mike Archbold <[email protected]> wrote:
>
>> Some people on Facebook are spiking the ball... I guess I won't say who ;)
>>
>> On Fri, Apr 19, 2024 at 4:03 PM Matt Mahoney <[email protected]>
>> wrote:
>>
>>> https://www.futureofhumanityinstitute.org/
>>>

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Te0da187fd19737a7-Mf6feb4f8bea607b7aed11189
Delivery options: https://agi.topicbox.com/groups/agi/subscription
