Yes, the iteration you mention is important... so at each stage of its life, the AGI is applying Occam's Razor according to the "programming language" implicit in what it has learned so far...
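A toy sketch of that iteration, assuming a byte-pair-style pair substitution as the compression step (all names and the stopping rule here are hypothetical illustrations, not OpenCog's actual mechanism): each pass promotes the most frequent adjacent pair to a new primitive, and later passes measure "length" in the vocabulary extended by everything learned so far.

```python
# Toy sketch: iterated Occam's Razor where each compression pass extends the
# "language" (a grammar of learned primitives) that the next pass measures
# simplicity against. Byte-pair-style; purely illustrative.
from collections import Counter

def compress_pass(seq, grammar):
    """One cycle: promote the most frequent adjacent pair to a new symbol.
    Promoting a pair occurring c times saves c symbols but costs a 2-symbol
    grammar rule, so it only shortens the description when c > 2."""
    pairs = Counter(zip(seq, seq[1:]))
    if not pairs:
        return seq, False
    (a, b), count = pairs.most_common(1)[0]
    if count <= 2:
        return seq, False
    new_sym = f"<{len(grammar)}>"   # fresh primitive, named by creation order
    grammar[new_sym] = (a, b)
    out, i = [], 0
    while i < len(seq):
        if i + 1 < len(seq) and (seq[i], seq[i + 1]) == (a, b):
            out.append(new_sym)
            i += 2
        else:
            out.append(seq[i])
            i += 1
    return out, True

def iterated_razor(data, max_passes=10):
    """Description length = residual sequence + learned grammar; the base
    alphabet only seeds the process, later passes reuse learned syntax."""
    seq, grammar = list(data), {}
    for _ in range(max_passes):
        seq, changed = compress_pass(seq, grammar)
        if not changed:
            break
    return seq, grammar

seq, grammar = iterated_razor("abababcdcdcd")
mdl = len(seq) + 2 * len(grammar)
print(f"{len('abababcdcdcd')} raw symbols -> {len(seq)} residual "
      f"+ {len(grammar)} rules = MDL {mdl}")
```

The point of the sketch is only that the simplicity measure is relative to the grammar accumulated so far, and that the choice of initial alphabet washes out over repeated cycles.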
This is a matter of learning "computational models / programming languages" on the fly, adapted specifically to the environment of a given intelligent system and to the internal structures that emerge within that system as it learns/grows/develops... And it may be that the "base" -- the initial measure of simplicity used at the start of this iteration -- is not all that relevant to the end result, so long as the base is something reasonable and not totally out of sync with the environment, goals and architecture of the system...

However, that doesn't eliminate the theoretical/mathematical/philosophical question I posed in that blog post... Please note: I don't think that question needs to be solved in order to create AGI. OpenCog already embodies its own choice of base language for measuring simplicity, the Atom language... I just think it's an interesting question anyway... And it may ultimately be relevant to creating a good general theory of general intelligence, which we currently lack... However, I strongly suspect we can succeed at building powerful AGI without needing a good general theory of general intelligence beforehand..

ben

On Mon, Aug 27, 2012 at 12:37 PM, Boris Kazachenko <[email protected]> wrote:
>> Just some speculations about possible theoretical computer science I'd
>> do if I had the time ;p
>>
>> http://multiverseaccordingtoben.blogspot.com/2012/08/finding-right-computational-model-to.html
>
> Maybe you should spend your time more wisely :).
>
> All this confusion results from using languages designed for irrelevant
> tasks. For GI, the language must be generated by the compressive Occam's
> Razor algorithm itself, with incremental syntax produced by its past
> iterations. Notice the POV difference: you don't have some fixed & final
> complexity for GI to reduce. Rather, you go through an indefinite number of
> complexity accumulation / compression cycles.
> In my terms, that's a current-level search / next-level evaluation cycle,
> iterated for as long as you keep accumulating the data.
>
> -------------------------------------------
> AGI
> Archives: https://www.listbox.com/member/archive/303/=now
> RSS Feed: https://www.listbox.com/member/archive/rss/303/212726-11ac2389
> Modify Your Subscription: https://www.listbox.com/member/?&
> Powered by Listbox: http://www.listbox.com

--
Ben Goertzel, PhD
http://goertzel.org

"My humanity is a constant self-overcoming" -- Friedrich Nietzsche
