I do not want to parse everything imortal.discoveries said, but my feeling is 
that as an AI program learns more, it will need to keep relatively more 
specialized data, and it will need to create more 'indexes' (or something that 
acts like an index) into that data. The result is superlinear growth (not 
strictly exponential, but exponential-like) in both the size of the data and 
the time it takes to find anything in it. There are two ways I would like to 
deal with this. The first is to use various kinds of generalizations as 
'indexes' into the more detailed information. The second is to find new kinds 
of networks to represent relationships among data. I feel the first is easy to 
imagine, but the second really requires something novel.
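The first idea could be sketched as a two-level structure: coarse general labels act as an index that narrows the search before the detailed records are scanned. This is only a toy illustration of that idea, assuming a flat label-to-record mapping; all class, field, and label names here are hypothetical, not from the post.

```python
from collections import defaultdict

class GeneralizationIndex:
    """Toy sketch: generalizations serve as 'indexes' into detailed data.
    A query narrows by a coarse label first, then filters the small
    remaining set of detailed records. (Illustrative names only.)"""

    def __init__(self):
        self.by_general = defaultdict(list)  # general label -> detailed records

    def add(self, generals, record):
        # Index the record under every generalization that covers it.
        for g in generals:
            self.by_general[g].append(record)

    def lookup(self, general, predicate):
        # Coarse index first, then a detailed filter over a smaller set.
        return [r for r in self.by_general[general] if predicate(r)]

idx = GeneralizationIndex()
idx.add(["bird", "animal"], {"name": "sparrow", "wingspan_cm": 20})
idx.add(["bird", "animal"], {"name": "eagle", "wingspan_cm": 200})
idx.add(["fish", "animal"], {"name": "trout", "wingspan_cm": 0})

# Only records under 'bird' are scanned, not the whole data set.
big_birds = idx.lookup("bird", lambda r: r["wingspan_cm"] > 100)
print([r["name"] for r in big_birds])  # → ['eagle']
```

The second idea (novel networks of relationships) has no obvious sketch like this, which is the point: the coarse-to-fine lookup is easy to imagine, while a genuinely new relational structure is not.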
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tfce80000509c1fb3-M5c0a91d3ce4b9f70f339e699