https://ai.googleblog.com/2022/01/google-research-themes-from-2021-and.html#Trend1



"and sparse models such as Google’s 600B parameter GShard 
<https://arxiv.org/abs/2006.16668> model and 1.2T parameter GLaM 
<https://ai.googleblog.com/2021/12/more-efficient-in-context-learning-with.html>
 model)"

I thought James said Google didn't do sparse models???:

>>> "Where is Google's big investment in sparse matrix multiplication hardware, 
>>> for example?  Why the reliance on dense models when it is known that's not 
>>> how the brain works?"

So maybe Google has sparse models but still no sparse-based hardware, perhaps 
because such hardware is too new or too costly for some reason, assuming James 
is right that they don't have it.
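For context on what "sparse matrix multiplication" means in James's question: a 
sparse kernel only touches the nonzero entries, so it can skip most of the work 
a dense kernel must do. Here's a minimal illustrative sketch of a CSR 
(compressed sparse row) matrix-vector product in plain Python; this is just to 
show the idea, not how Google (or any accelerator) actually implements it, and 
note that "sparse" in GShard/GLaM may refer to sparsely activated 
mixture-of-experts models rather than sparse matrix kernels.

```python
# Illustrative sketch only: CSR matrix-vector multiply that touches
# only the stored nonzero entries, unlike a dense kernel which visits
# every element of the matrix.

def csr_matvec(data, indices, indptr, x):
    """Multiply a CSR-format matrix by a dense vector x."""
    y = [0.0] * (len(indptr) - 1)
    for row in range(len(indptr) - 1):
        # Only iterate over this row's nonzeros.
        for k in range(indptr[row], indptr[row + 1]):
            y[row] += data[k] * x[indices[k]]
    return y

# The 3x3 matrix [[2, 0, 0], [0, 0, 3], [0, 4, 0]] in CSR form:
data = [2.0, 3.0, 4.0]    # nonzero values, row by row
indices = [0, 2, 1]       # column index of each nonzero
indptr = [0, 1, 2, 3]     # where each row starts in data/indices

print(csr_matvec(data, indices, indptr, [1.0, 1.0, 1.0]))  # [2.0, 3.0, 4.0]
```

With only 3 stored values the inner loop does 3 multiply-adds instead of the 9 
a dense kernel would do; at the scale of models like the ones above, that gap 
is the whole argument for sparse-aware hardware.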
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tffc785d7f1f92961-M6e345c36fa902273d2db7b31
Delivery options: https://agi.topicbox.com/groups/agi/subscription