I am a third-year student pursuing a Bachelor's in Computer Science and Engineering, and I have wanted to get into AGI for maybe two years now. I discovered OpenCog but found it too daunting; I think I'd need another year or two of study to make good sense of it.
Last summer I studied some first language acquisition (along with Andrew Ng's basic ML course, Stanford's CS 224n on NLP with deep learning, and a Foundations of ML course at my own university that was more rigorous and exhaustive than Ng's, anyway). Reading about first language acquisition led me to believe that a primary problem is being able to represent the world, in as much detail as possible, since dealing with block worlds is easy. So that is the primary issue.

Recently, I spent some time with Natural Semantic Metalanguage and its criticisms. The concept is certainly ambitious; however, it doesn't seem to be the way our thoughts work. For instance, see the explication for "left" <https://linguistics.stackexchange.com/questions/29586/nsm-explication-for-left>; for me, 'left' just evokes a direction rather than all those other things. It might still be useful for representing the world in a computer, but an explicit simulation (which I haven't worked with yet) seems more wieldy.

I found SOAR ambitious and more "established"; however, their forums <https://soar.eecs.umich.edu/forum> seemed empty. So I just wanted to know if anyone is working with it. Beyond that, does there exist some system for representing the world? Gaming systems come to mind, but is there some established standard?

Thanks!

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T7116d9f5c1f0551d-Me0e50014ca995e0fc095d37b
Delivery options: https://agi.topicbox.com/groups/agi/subscription
