To be clear, they are aiming for "general" artificial intelligence now. 
Right?

https://www.youtube.com/watch?v=aQGGSLS8plk&ab_channel=AINews

This new one ("Bloom AI") seems to be a big one. In terms of size, anyway:

 * 176 billion parameters
 * 70 layers, 112 attention heads
 * Hidden layers are 14336-dimensional (sic)
 * Sequence length of 2048 tokens used

(Minute 1 into the video.)
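Those numbers roughly hang together, by the way. A common back-of-the-envelope approximation for a decoder-only transformer is about 12 × layers × hidden² parameters for the attention and MLP blocks (embeddings not included); this is a standard rule of thumb, not anything stated in the video:

```python
# Rough sanity check on the quoted stats, using the common approximation
# params ~= 12 * n_layers * d_model**2 for a decoder-only transformer
# (attention + MLP blocks only; embeddings and biases ignored).
n_layers = 70
d_model = 14336

approx_params = 12 * n_layers * d_model ** 2
print(f"~{approx_params / 1e9:.0f}B parameters")  # ~173B
```

The remaining few billion to reach the quoted 176B would come mostly from the embedding matrix, which this approximation deliberately leaves out.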

But... I'm sorry, guys: just looking at those bullet points above, 
something tells me this is *not* the Holy Grail of AGI. It just isn't. It's a 
big conclusion machine, but its architecture is very rigid, and looking back, 
we will realize that we brute-forced the problem instead of looking for more 
flexible architectures.

Have you noticed that a neural network performs the same amount of computation 
every time you feed it an input? Every input takes the exact same number of 
processing steps. That cannot possibly be the most efficient, or even the most 
effective, architecture for general AI. 
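To make that point concrete, here is a toy fixed-architecture MLP (made-up weights, nothing to do with BLOOM itself) that counts every multiply-add it performs. Whether the input is trivial or not, the count is identical:

```python
import math

def forward(x, weights):
    """Tiny fixed-architecture MLP; counts every multiply-add performed."""
    ops = 0
    activations = x
    for layer in weights:                # same layers traversed for every input
        next_act = []
        for neuron in layer:
            total = 0.0
            for a, w in zip(activations, neuron):
                total += a * w           # one multiply-add
                ops += 1
            next_act.append(math.tanh(total))
        activations = next_act
    return activations, ops

# A hypothetical 3-2-1 network with arbitrary weights.
weights = [
    [[0.1, -0.2, 0.3], [0.4, 0.0, -0.1]],  # 3 inputs -> 2 hidden
    [[0.5, -0.5]],                          # 2 hidden -> 1 output
]

_, ops_trivial = forward([0.0, 0.0, 0.0], weights)   # "easy" input
_, ops_hard = forward([1.0, -2.0, 3.0], weights)     # "hard" input
print(ops_trivial, ops_hard)  # identical op counts: 8 8
```

Contrast that with, say, a classic search algorithm, which does more work on harder instances; a plain feed-forward pass has no such notion of spending effort where it's needed.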

Let's be prepared for the revelations of actual next-generation AI.

Cheers
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Ted87c91d07178415-M7c7410fc0c7e8f0768fe78cd