https://www.nature.com/articles/s42256-019-0026-3

Generative descriptions require general recursion, and hence
self-reference.  Another way of framing these different kinds of
languages is the Chomsky hierarchy of languages or "grammars" --
specifically the distinction between type-0 and the other types,
type-0 languages being the "recursively enumerable" ones.  Recurrent
Neural Networks are adequate because they, by definition, permit
self-reference.  They can (in theory, given unlimited memory) be
Universal Turing Machines.
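To make the point concrete, here is a minimal sketch (my own
illustration, not from the article or this thread; all names and
weight values are made up): the "self-reference" of a recurrent
network is just the hidden state feeding back into its own update
rule, which is what lets the network carry unbounded computation
forward in time.

```python
# A toy recurrence: the new hidden state is a function of the
# previous hidden state -- the self-reference that feedforward
# (non-recurrent) architectures lack.

def rnn_step(h, x, w_h=0.5, w_x=1.0):
    """One recurrent update; h refers back to its own earlier value."""
    return w_h * h + w_x * x

def run(inputs, h0=0.0):
    h = h0
    states = []
    for x in inputs:
        h = rnn_step(h, x)   # self-referential update
        states.append(h)
    return states

# An input at step 1 still influences the state at later steps:
print(run([1.0, 0.0, 0.0]))  # [1.0, 0.5, 0.25]
```

With unbounded state (the "unlimited memory" caveat above), update
rules of this recurrent form are what makes the Turing-completeness
claim possible; a finite, fixed-precision network is only a finite-state
machine.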


On Mon, Oct 19, 2020 at 8:24 PM <[email protected]> wrote:
>
> Is the "one that disallows self-reference." the Shannon one? Shannon is 
> better?
>
> And does "disallows self-reference" mean they are doing evaluation using 
> training+testing sets, unlike the Hutter Prize which focuses on true 
> Compression?

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T6761a13445e5864b-Me220ff1f9f3a426daa27ac51
Delivery options: https://agi.topicbox.com/groups/agi/subscription