On Wed, Dec 4, 2019, 7:19 AM <[email protected]> wrote:

> Adam and Eve
> https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2813846/
>
> Dr. Eliza:
> http://dreliza.com/selfselection.php
>
> GPT-2:
> https://openai.com/blog/better-language-models/
>

Interesting that GPT-2 claims to improve compression of enwik8 from 0.99 to
0.93 bits per character (bpc). The Hutter Prize best result is 1.22 bpc, and
the best result on the Large Text Compression Benchmark (with no hardware
restrictions) is 1.18 bpc, by cmix. Of course, those figures include the size
of the decompressor.


------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T7116d9f5c1f0551d-Meedf8e65ae3da9570469f207
Delivery options: https://agi.topicbox.com/groups/agi/subscription
