Ben,

On Mon, Jul 9, 2012 at 7:35 AM, Ben Goertzel <[email protected]> wrote:
> On Mon, Jul 9, 2012 at 10:23 PM, Steve Richfield <[email protected]> wrote:
>> Hi all,
>>
>> I wonder whether most people here grok that the world is fundamentally a
>> gargantuan set of simultaneous nonlinear differential equations for us to
>> solve in order to achieve our goals?
>
> We all know that is the way physics conventionally models the world... but
> not all of us think that's the most useful model to use for AGI purposes.
>
> One could just as well say "the world is fundamentally a gargantuan
> computer program" -- this view is also consistent with all known physics...

There are some BIG weaknesses in this POV:

1. Differential equations describe a system. Errors in the equations map to errors in the system being described, and can be directly corrected by adjusting the equations to eliminate the differences between simulated/solved and observed results. Of course, computer programs can be similarly adjusted, but ONLY when they are also in effect paralleling reality, so that we are comparing like things.

2. Perhaps you can explain. I know of no way of making computer code consistent with known physics, except through EXTREME programming effort that in effect makes it solve the same differential equations, albeit potentially by programmers who don't even realize what they are doing. Again, we are comparing like things.

In short, sure, you can do pretty much the same thing with a computer program -- which is EXACTLY what I am proposing; moreover, my proposal leads fairly directly to the detailed mechanisms needed to accomplish this. Wonderful how great minds think alike.
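To make point 1 concrete, here is a minimal sketch (all names, equations, and constants are my own illustrative choices, not anyone's actual AGI system) of "adjusting the equations to eliminate the differences between simulated/solved and observed results": a model dx/dt = -k*x with an unknown rate constant k is corrected by picking the k whose simulated trajectory best matches observations.

```python
def simulate(k, x0=1.0, dt=0.01, steps=500):
    """Forward-Euler integration of the toy model dx/dt = -k * x."""
    xs, x = [], x0
    for _ in range(steps):
        xs.append(x)
        x += dt * (-k * x)
    return xs

# "Observed" data from the real system; its true k = 0.5 is
# unknown to the model and must be recovered from the data.
observed = simulate(0.5)

def error(k):
    """Sum of squared differences between simulation and observation."""
    return sum((a - b) ** 2 for a, b in zip(simulate(k), observed))

# Correct the equation: choose the k that best reproduces observations
# (a crude grid search, purely for illustration).
candidates = [i / 100 for i in range(1, 101)]
best_k = min(candidates, key=error)
print(best_k)  # recovers k = 0.5
```

The same adjust-until-it-matches loop applies to a computer program, but, as noted above, only when the program is in effect paralleling reality in the first place.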
It appears to me that the primary point of failure in our discussions is that you haven't yet "seen the forest for the trees": when, in the long distant future, you FINALLY get your code working the way you want it to work, it absolutely MUST in effect be learning and solving vast systems of differential equations, regardless of whether or not its programmers ever realize it. Of course, realizing what you are doing goes a LONG way toward directing your efforts in a productive direction.

The (only?) remaining wrinkle in what I have been saying is that von Neumann-style computers are REALLY REALLY poor at solving systems of differential equations -- arguably even worse than old vacuum-tube analog computers -- and hence similarly poor at executing software that "pretends" to solve differential equations, even when it is written by programmers who don't (yet) realize what they are trying to do.

Sticking your fingers in your ears and loudly saying "NO NO NO" while ignoring the math doesn't alter the fact that you are in effect trying to write code that solves systems of differential equations. Of course, people who ignore the underlying math are usually doomed by it. Do you know of any exceptions to this rule?

There are other, radically different architectures that natively solve systems of differential equations, so why pretend when you can more easily do the real thing -- unless you are simply trying to quickly produce brain-dead demonstrations of pseudo-AGI, akin to the original Eliza? We could simulate such a machine on current hardware. The simulation would be slow, but at least it would run at SOME non-zero speed, so that real and useful research could begin.

Agreement aside, I haven't yet heard words suggesting that you have grokked what I have been saying. Perhaps you can better relate to Sergio's POV in his posting, made while I was writing this. Right now, all I am looking for is an indication that you understand what Sergio and I have been saying.
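For what it's worth, simulating a native differential-equation machine on current hardware really is straightforward, just slow. Here is a minimal sketch (the system, constants, and step size are my own illustrative assumptions): a coupled nonlinear system, Lotka-Volterra chosen purely as an example, stepped with classical 4th-order Runge-Kutta on an ordinary von Neumann machine.

```python
def lotka_volterra(state, a=1.0, b=0.1, c=1.5, d=0.075):
    """Coupled nonlinear ODEs: dx/dt = ax - bxy, dy/dt = -cy + dxy."""
    x, y = state
    return (a * x - b * x * y, -c * y + d * x * y)

def rk4_step(f, state, dt):
    """One classical 4th-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f(tuple(s + dt / 2 * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + dt / 2 * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6 * (p + 2 * q + 2 * r + w)
                 for s, p, q, r, w in zip(state, k1, k2, k3, k4))

# Integrate from t = 0 to t = 20 in 2000 small steps -- slow next to
# analog hardware that solves the system natively, but it DOES run.
state = (10.0, 5.0)
for _ in range(2000):
    state = rk4_step(lotka_volterra, state, 0.01)
print(state)  # both populations remain positive and bounded
```

Scaling this from two equations to the vast coupled systems I am talking about is exactly where von Neumann machines bog down, which is why a simulation like this would be slow -- but non-zero speed is enough for real research to begin.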
Only then can we carry on a productive conversation about it.

Steve

-------------------------------------------
AGI Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-c97d2393
Powered by Listbox: http://www.listbox.com
