The big breakthrough with transformers was letting the network weigh how
relevant each part of the input is to every other part as it parses the
syntax, instead of leaning on strict ordering. That freedom enables a lot
more coherent output, but I think the next step is giving the NN some way
to re-rank the relevance of the syntax it's given at inference time, after
training -- like a short-term focus mechanism.

The weights behind the current attention mechanisms are learned at
training time and never change afterwards, and I think that's the cause of
the horribly incoherent output the moment you switch up the style of
input, no matter how big these models get. Being able to subtly tweak
those weights at inference time might help, but I could be completely
wrong and they already tried this or are doing this behind the scenes.
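
To make that concrete, here's a rough NumPy sketch of what I mean. The
"focus" knob is my own invention, not something real models expose: the
learned projections stay frozen, and the only inference-time dial is a
scalar that sharpens or flattens the attention scores:

import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(x, W_q, W_k, W_v, focus=1.0):
    # W_q, W_k, W_v were learned during training and are frozen here
    q, k, v = x @ W_q, x @ W_k, x @ W_v
    d = q.shape[-1]
    # hypothetical inference-time knob: focus > 1 sharpens the score
    # distribution (stronger short-term focus), focus < 1 flattens it
    scores = (q @ k.T) / np.sqrt(d) * focus
    return softmax(scores) @ v

# toy usage: 5 tokens, 8-dim embeddings, random "frozen" weights
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
out_default = attention(x, W_q, W_k, W_v)
out_focused = attention(x, W_q, W_k, W_v, focus=2.0)

Rescaling the scores is just a stand-in, of course; the real version of
this idea would presumably adapt the projections themselves per input.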

On Fri, Jul 15, 2022, 8:47 PM <[email protected]> wrote:

> On Friday, July 15, 2022, at 3:37 PM, stefan.reich.maker.of.eye wrote:
>
> Have you noticed how a neural network performs the same amount of
> computation every time you feed it some input? Every input takes the
> exact same number of processing steps. That cannot possibly be the most
> efficient or even effective architecture for general AI.
>
> Ya... the new trend (or to-be trend) will be sparse models, where only
> some of the parameters are used for any given input (note: I might be
> wrong, but I'm somewhat sure that's what sparse models are).
>
> One benefit:
> "Sparse Modeling *provides feedback explaining the results and reasoning
> behind its solutions*, giving it an edge against conventional AI methods."
>
> More benefits found here:
> https://hacarus.com/sparse-modeling-benefits/
>
> this link sounds nice:
> https://blogs.nvidia.com/blog/2020/05/14/sparsity-ai-inference/
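
P.S. on the sparse-model point above: as I understand it, the common form
of this is mixture-of-experts routing, where a small router network picks
one (or a few) expert sub-networks per input and the rest of the
parameters sit idle. A toy sketch of top-1 routing, with all names and
sizes made up:

import numpy as np

rng = np.random.default_rng(0)

# toy setup: 4 "expert" weight matrices plus a router, all frozen
n_experts, d = 4, 8
experts = [rng.normal(size=(d, d)) for _ in range(n_experts)]
router = rng.normal(size=(d, n_experts))

def sparse_forward(x):
    # the router scores each expert; only the winner's parameters are
    # touched, so per-input compute is a fraction of the full model
    winner = int(np.argmax(x @ router))
    return x @ experts[winner], winner

x = rng.normal(size=(d,))
y, used = sparse_forward(x)
print(f"expert {used} ran; the other {n_experts - 1} sat idle")

That also speaks to the fixed-computation complaint: with routing,
different inputs exercise different parts of the network, and in
principle a router could spend more or less compute per input too.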

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Ted87c91d07178415-M738c0b2a496c4174ad866e11
Delivery options: https://agi.topicbox.com/groups/agi/subscription
