You might consider Ted (Xanadu Hypertext) Nelson's "Embedded Markup
Considered Harmful" as a clue to winning the top slot in the
Hutter Prize (rules here <http://prize.hutter1.net/hrules.htm>) or, failing
that, Matt Mahoney's Large Text Compression Benchmark (rules here
<http://mattmahoney.net/dc/textrules.html>):

Factor the markup out of the plain text, using a mapping structure to
retain the information.  Then apply your advanced natural-language-modeling
technology to compressing the plain text, and compress the markup-to-plain-text
mapping structure separately, using techniques suited to that
purpose.
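The separation step could be sketched roughly like this -- a toy illustration only, using angle-bracket tags as the "markup" and zlib as a stand-in for both the language model and the mapping-specific coder (neither of which is specified by the prize rules):

```python
import re
import zlib

def split_markup(doc):
    """Separate markup tokens from plain text, keeping a position map.

    Returns (plain_text, events), where events is a list of
    (offset_in_plain_text, markup_token) pairs sufficient to
    rebuild the original document losslessly.
    """
    plain, events = [], []
    pos = 0
    for m in re.finditer(r"<[^>]+>", doc):
        plain.append(doc[pos:m.start()])
        events.append((sum(len(t) for t in plain), m.group()))
        pos = m.end()
    plain.append(doc[pos:])
    return "".join(plain), events

def rebuild(plain, events):
    """Reinsert the markup tokens at their recorded offsets."""
    out, last = [], 0
    for offset, token in events:
        out.append(plain[last:offset])
        out.append(token)
        last = offset
    out.append(plain[last:])
    return "".join(out)

doc = "<p>Hello <b>world</b></p>"
plain, events = split_markup(doc)
assert plain == "Hello world"
assert rebuild(plain, events) == doc

# The two streams can now be coded independently; in a real entry a
# language model would replace zlib on the plain-text side.
compressed_text = zlib.compress(plain.encode())
compressed_map = zlib.compress(repr(events).encode())
```

The point of the round trip is that the plain-text stream is now free of markup noise for the language model, while the mapping stream is regular enough to deserve its own specialized coder.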

On Wed, Dec 18, 2019 at 10:18 PM YKY (Yan King Yin, 甄景贤) <
[email protected]> wrote:

> This set of slides was written in September, just after the AGI 2019
> Conference in China, but I only got time to translate them into English
> today:
>
> English version:
> https://drive.google.com/open?id=1J9_rihrWWXvQE1-wTz5iXOXhdnHpK7Wx
>
> Chinese version:
> https://drive.google.com/open?id=1IGfRaUc-uSSEca2D-mwF5T05JRfxUVg7
>
> One contribution I'm quite proud of is that reinforcement learning can be
> viewed as solving the Schrödinger equation.  Though it may not be of very
> high practical value 😆
>
> I'm currently designing an AGI architecture borrowing ideas from Google's
> BERT.  I will explain this in another set of slides.
>
> Comments and suggestions welcome 😊

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tcfe7cc93841eec23-Mfdc41d604fda148d4474f097
Delivery options: https://agi.topicbox.com/groups/agi/subscription
