They can make the code and the computer faster or bigger, but creating new data 
from the same-sized dataset is what's key to making it smarter 
(induction, semantics, generalization). Now, you may find ways to use *less 
data* and be smarter, and I'll give an example, but technically that simply 
means there's a LOT of "data" conveyed in that one item. For example, all you do 
is look at the last 20 words and see "cat cat cat...", so "cat" is 99.99% likely 
next. And you barely used big data here!
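The cat example above can be sketched as a trivial frequency predictor over the last 20 words. This is a minimal illustration, not any real model's API; the function name `predict_next` and the `window` parameter are made up for the sketch:

```python
from collections import Counter

def predict_next(words, window=20):
    """Predict the next word from recent context alone: count tokens in
    the last `window` words and return the most frequent one together
    with its empirical probability in that window."""
    context = words[-window:]
    counts = Counter(context)
    word, n = counts.most_common(1)[0]
    return word, n / len(context)

# A highly repetitive context: almost no data is needed to be confident.
context = ["cat"] * 19 + ["dog"]
print(predict_next(context))  # → ('cat', 0.95)
```

The point survives the toy form: when the local context is this redundant, one short window carries nearly all the information, so no "big data" is needed to make a confident prediction.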
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T68d1730d849e1bd8-Mb4a0e3ac608a357e80e4cf54