Both my benchmark and the Hutter Prize count the size of the
decompressor, including any training data, toward the score. GPT-2 was
trained on a lot of data. It might still work if trained only on enwik9,
but I don't think it was designed that way. It would be an interesting
experiment.
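
To make the scoring rule concrete, here is a minimal sketch of how a
self-contained compression score could be computed. The file names and
helper are illustrative assumptions, not the official Hutter Prize or
LTCB tooling:

    import os

    def benchmark_score(compressed_file, decompressor_archive):
        # Total counted size: the compressed data plus the decompressor
        # itself, which must contain any model weights or training data
        # it depends on (nothing external is free).
        return (os.path.getsize(compressed_file)
                + os.path.getsize(decompressor_archive))

    # Hypothetical example: a GPT-2-based entry would have to ship its
    # weights inside the decompressor archive, so they count against it.
    # score = benchmark_score("enwik9.compressed", "decomp_with_weights.zip")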

On Mon, Jul 19, 2021, 8:23 AM <[email protected]> wrote:

> No but on your contest, why couldn't he use GPT-2 as is? What changes had
> to be made and why? I remember hearing GPT-2 was non-deterministic, but
> what else besides that?
