Steve, I am adjusting your statement to add what I think is missing:

>> "any prospective AGI platform absolutely **MUST** be capable of *rapidly
>> learning to* perform*ing* substantially all of the high-level cognitive
>> information processing functions that have been observed in human
>> mind/brains *without carefully ignoring areas (like the hypothalamus)
>> that perform functions that appear incompatible with the platform.*"

The hypothalamus is not a function, but rather a system. So to accord with my statement, you would need to enumerate which of the human mind's high-level cognitive functions you think OpenCog (or other AGI designs) ignores due to not adequately including sufficiently "hypothalamus-like" components or processes...
Also, I don't agree at all that an AGI must be capable of rapidly learning to perform all its high-level functions. A human mind learns to cognize over a period of years, and does so via a complex combination of learning with the scheduled/triggered unveiling of genetically encoded capabilities...

Similarly, I think it's OK if an AGI learns its cognitive capabilities over a period of years, and if it leverages some appropriately in-built capabilities. A human mind is not a tabula rasa, and nor need an AGI mind be...

-- Ben G

-------------------------------------------
AGI Archives: https://www.listbox.com/member/archive/303/=now
