It loves/seeks:

More large, diverse data, obtained by collecting or generating it.

A more compressed hierarchy, i.e. lower node count and lower cost (error). The 
Hutter Prize shows that lossless compression stores less data yet can still 
regenerate all of it (see the sketch after this list).

More data related to its goal nodes.
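
As a side note on the compression point above, here is a minimal Python sketch 
(using zlib as a stand-in compressor, not the actual Hutter Prize setup) of 
what "less data but can regenerate all of it" means in practice:

import zlib

# Minimal sketch: lossless compression shrinks the stored bytes,
# yet decompression regenerates every original byte exactly.
original = b"the cat sat on the mat. " * 200   # highly repetitive data
compressed = zlib.compress(original, 9)        # smaller representation
restored = zlib.decompress(compressed)         # full regeneration

assert restored == original                    # nothing was lost
print(len(original), "bytes ->", len(compressed), "bytes")

The same round-trip property is what the Hutter Prize scores, just with far 
stronger compressors applied to a fixed Wikipedia corpus.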

It wants to come to equilibrium: there would be a point at which it accepts no 
further data, because it effectively has it all. The hierarchy wants the simple 
Darwinian goal of survival, pursued through food, multiplying, immortality, 
AGI, nanobots, and cryonics, so until it sees its goal reached, it isn't 
satisfied. It develops the goal of technology because that is the only way to 
truly ensure no death. My data tells me I will die and get zero food/sex, and 
my data tells me how to get an eternity of both, which results in ultimate 
survival.