On Friday, July 15, 2022, at 3:37 PM, stefan.reich.maker.of.eye wrote:
> Have you noticed how a neural network performs the same amount of computation 
> every time you feed it with some input? Every input takes the exact same 
> number of processing steps. That cannot possibly be the most efficient or 
> even effective architecture for general AI. 
> 
Yeah... the emerging trend (or soon-to-be trend) is sparse models, where only a 
subset of the parameters is activated for any given input, so the amount of 
computation can vary per input (caveat: I might be wrong, but I'm fairly sure 
that's what sparse models are).
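
To make that concrete, here's a toy sketch of conditional computation in the mixture-of-experts style, which is one common way "sparse model" is meant: a gating function scores a set of expert sub-networks and only the top few actually run for a given input. Everything here (names, sizes, the linear experts) is illustrative, not taken from any real library or paper.

```python
# Toy mixture-of-experts routing: only TOP_K of N_EXPERTS parameter
# matrices are used per input, unlike a dense net where all run every time.
import numpy as np

rng = np.random.default_rng(0)

N_EXPERTS, D_IN, D_OUT, TOP_K = 8, 4, 3, 2

# Each "expert" is just a linear layer; a dense model would apply all 8.
experts = [rng.standard_normal((D_IN, D_OUT)) for _ in range(N_EXPERTS)]
gate_w = rng.standard_normal((D_IN, N_EXPERTS))

def sparse_forward(x):
    """Route x through only the TOP_K highest-scoring experts."""
    scores = x @ gate_w                   # one gating score per expert
    top = np.argsort(scores)[-TOP_K:]     # indices of the chosen experts
    # Softmax over the chosen scores only, so the mixing weights sum to 1.
    w = np.exp(scores[top] - scores[top].max())
    w /= w.sum()
    # Only TOP_K weight matrices are touched here; the rest sit idle.
    return sum(wi * (x @ experts[i]) for wi, i in zip(w, top))

x = rng.standard_normal(D_IN)
y = sparse_forward(x)
print(y.shape)  # (3,)
```

Different inputs pick different experts, so which parameters do work (and in bigger designs, how much work) depends on the input, which is exactly the property the quoted question is asking about.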

One benefit:
"Sparse Modeling *provides feedback explaining the results and reasoning behind 
its solutions*, giving it an edge against conventional AI methods."

More benefits found here:
https://hacarus.com/sparse-modeling-benefits/
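
A toy illustration of that explainability claim: in a sparse linear model, the few coefficients that come out nonzero directly name which inputs drove the prediction. This sketch fits an L1-penalized least-squares model with ISTA (iterative soft-thresholding); all sizes and values are made up for the demo.

```python
# Sparse recovery demo: of 10 candidate features, only 2 truly matter,
# and the L1 penalty zeroes out the rest, "explaining" the prediction.
import numpy as np

rng = np.random.default_rng(1)

n, p = 50, 10
X = rng.standard_normal((n, p))
true_w = np.zeros(p)
true_w[[2, 7]] = [3.0, -2.0]              # ground truth: features 2 and 7
y = X @ true_w + 0.01 * rng.standard_normal(n)

# ISTA for: minimize 0.5*||X w - y||^2 + lam*||w||_1
lam = 1.0
step = 1.0 / np.linalg.norm(X, 2) ** 2    # 1 / spectral_norm(X)^2
w = np.zeros(p)
for _ in range(500):
    grad = X.T @ (X @ w - y)              # gradient of the smooth part
    w = w - step * grad
    # Soft-thresholding: small coefficients snap to exactly zero.
    w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)

support = np.nonzero(np.abs(w) > 0.1)[0]
print(support)                            # the features that explain y
```

The recovered support is the "feedback" the quote is talking about: instead of a dense weight soup, you get a short list of features the model actually used.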

This NVIDIA post on sparsity in AI inference also looks good:
https://blogs.nvidia.com/blog/2020/05/14/sparsity-ai-inference/
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Ted87c91d07178415-M58dfbe0e1f0f048e7a8622cc
Delivery options: https://agi.topicbox.com/groups/agi/subscription
