I hope they use this in GPT-4.
Infinite Memory Transformer: Attending to Arbitrarily Long Contexts Without
Increasing Computation Burden | Synced (syncedreview.com)
<https://syncedreview.com/2021/09/09/deepmind-podracer-tpu-based-rl-frameworks-deliver-exceptional-performance-at-low-cost-100/>

On Thu, Sep 9, 2021 at 9:34 PM <[email protected]> wrote:

> if you want to make things really easy for ISIS and Japan, go ahead,
> give it all away.
> *Artificial General Intelligence List <https://agi.topicbox.com/latest>*
> / AGI / see discussions <https://agi.topicbox.com/groups/agi> +
> participants <https://agi.topicbox.com/groups/agi/members> +
> delivery options <https://agi.topicbox.com/groups/agi/subscription>
> Permalink
> <https://agi.topicbox.com/groups/agi/T3b45ab6955eb04e7-M2b3ad9b21d8605fbcfb0410b>
>

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T3b45ab6955eb04e7-Mb2dfaf2fe52f7dbdd718a848
Delivery options: https://agi.topicbox.com/groups/agi/subscription
