It is nonsense to respond to the OP the way you did unless your purpose is
to derail objective metrics of AGI. I can think of lots of reasons to do
that, not the least of which is that you don't want AGI to happen.

On Thu, Mar 28, 2024 at 1:34 PM Quan Tesla <[email protected]> wrote:

> Would you like a sensible response? What's your position on the
> probability of AGI without the fine structure constant?
>
> On Thu, Mar 28, 2024, 18:00 James Bowery <[email protected]> wrote:
>
>> This guy's non sequitur response to my position is so inept as to exclude
>> the possibility that it is an LLM.
>>
>

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T5c24d9444d9d9cda-M2c027c8ae3dbb0bd565e11ee
Delivery options: https://agi.topicbox.com/groups/agi/subscription
