What I’m saying is:
Blender can be forced to incorporate certain topics into what it says, so it
talks about, e.g., cars all the time, no matter what. Humans do this too, but
they evolve the topic. They start off wanting food or mom; then they can
discover that food is semantically tied to farming, and now they talk about
farming a lot more. The agenda/persona updates and specializes into a narrow
domain. It evolves its question. The AGI's output controls which input source
it collects/generates data from: it decides which lab tests to try, and those
results determine which lab tests to try next. At first the input source is
random data collection, just like those robots that learn to walk.
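To make that loop concrete, here is a toy Python sketch (the topics, payoffs,
and the food -> farming link are all invented for illustration, not anything
Blender actually does): the agent starts with pure random data collection, its
own choice of "test" determines the next input, and topics that pay off (plus
their semantic neighbors) get reinforced, so the agenda specializes on its own.

import random

TOPICS = ["food", "farming", "cars", "weather"]
RELATED = {"food": "farming"}   # toy semantic link: food -> farming

def run_test(topic):
    # Stand-in for collecting data from the chosen input source;
    # pretend farming-related tests yield the most informative results.
    payoff = {"food": 0.4, "farming": 0.9, "cars": 0.1, "weather": 0.2}
    return payoff[topic] + random.uniform(-0.05, 0.05)

def agent_loop(steps=200, explore_decay=0.98):
    weights = {t: 1.0 for t in TOPICS}   # flat starting agenda
    explore = 1.0                        # begin with pure random collection
    for _ in range(steps):
        if random.random() < explore:
            topic = random.choice(TOPICS)          # random exploration phase
        else:
            topic = max(weights, key=weights.get)  # output picks next input
        reward = run_test(topic)
        weights[topic] += reward                   # reinforce what paid off
        neighbor = RELATED.get(topic)
        if neighbor:
            weights[neighbor] += 0.5 * reward      # spillover: food -> farming
        explore *= explore_decay
    return weights

if __name__ == "__main__":
    print({t: round(w, 1) for t, w in agent_loop().items()})

Run it and the weights typically end up concentrated on farming, i.e. the
question has evolved away from the flat starting agenda.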