Alan,

On Sun, Jul 8, 2012 at 9:16 PM, Alan Grimes <[email protected]> wrote:

> Charles Hixson wrote:
> > Perhaps the sub-groups/sub-lists could have the "custom" of marking it
> > in the subject, say if it was about analog AI having the subject start
> > with "[Analog]".  I'm not talking about anything as formal as a
> > guideline, and definitely not a requirement, just something so that
> > those not interested could set their mail filters to mark the posts
> > already read.
>
> When I was a little kid, I was a fan of analog and hybrid computing.
>
> Then I grew up.
>

I still sometimes get a gig designing some strange analog "computing"
electronics. My last such project was an ohmmeter-powered circuit that
lived in telephone trunk lines and monitored gas flow, signaling the
reading by changing the "leakage" on the line. It had to read a
Hall-effect flow sensor and emulate older devices in which a bellows
operated a switch that changed resistors, so a strange non-linearity was
needed. It would use the leakage to slowly charge a capacitor, then
"wake up" for a millisecond every few seconds, read the flow, compute
the new leakage, store it on a capacitor that operated a FET that
affected the leakage, and go back to sleep.

>
> What I want explained is how you get from "analog AI" to symbolic
> thought, which pretty much every human is capable of at some level or
> other.
>

At great risk of jumping Ben's gun, and maybe with a little hope of
affecting Ben's writings, I will answer this from my own perspective.

Theory aside, the great goal of analog computing is to be able to simulate
something in analog, while "twiddling" its independent variables until a
particular goal is achieved. For AI/AGI, we would be simulating reality,
while twiddling the things we could affect, to see if we could achieve a
desired result, e.g. dinner.
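The "twiddling" idea can be sketched in a few lines. This is only a toy
illustration under my own assumptions: the "reality" being simulated is a
one-variable leaky integrator, and the function names (simulate,
twiddle_for_goal) are hypothetical, not any particular system's API.

```python
# Toy sketch of analog-style computing: simulate a system forward in
# time while "twiddling" an independent variable until a goal is met.
# The system here (a leaky integrator, dx/dt = u - k*x) is just a
# stand-in for whatever slice of reality we would actually simulate.

def simulate(u, k=0.5, x0=0.0, dt=0.01, t_end=20.0):
    """Incrementally integrate dx/dt = u - k*x (Euler steps)."""
    x = x0
    for _ in range(int(t_end / dt)):
        x += (u - k * x) * dt
    return x  # state at the end of the run

def twiddle_for_goal(goal, lo=0.0, hi=10.0, tol=1e-3):
    """Bisect on the input u until the simulated outcome hits the goal."""
    while hi - lo > tol:
        u = (lo + hi) / 2.0
        if simulate(u) < goal:
            lo = u
        else:
            hi = u
    return (lo + hi) / 2.0

u = twiddle_for_goal(goal=4.0)
print(round(simulate(u), 2))  # lands on the desired result: 4.0
```

The point is the loop structure, not the particular system: run the
simulation, compare the outcome to the goal, adjust the knob, repeat.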

Now, suppose that instead of "solving" the differential equations via
incremental simulation, we have a higher-level mathematical capability to
directly solve simpler systems of differential equations, and to
extrapolate the behavior of more complex and radically non-linear
systems, reducing the computations required by orders of magnitude. The
overall "block diagram" would look somewhat similar to incremental
simulation, but the components would have to be MUCH smarter to go from
arithmetic to calculus.

Both ways we would be "solving" the complex systems of simultaneous
differential equations that describe our reality, one way with incremental
numerical methods, and the other way with calculus.
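The gap between the two routes shows up even on the simplest possible
case. A minimal sketch, assuming the textbook test equation
dx/dt = -k*x (my choice of example, not anything specific to AGI):

```python
import math

# The same equation, dx/dt = -k*x, solved both ways:
# 1) incremental numerical simulation (many small Euler steps), and
# 2) calculus (the closed form x(t) = x0 * exp(-k*t), one evaluation).

def euler(k, x0, t_end, dt):
    """Many small arithmetic steps: x <- x + (dx/dt) * dt."""
    x = x0
    for _ in range(int(t_end / dt)):
        x += (-k * x) * dt
    return x

def closed_form(k, x0, t_end):
    """One calculus step: the exact solution, evaluated once."""
    return x0 * math.exp(-k * t_end)

# Euler grinds through 100,000 arithmetic steps here; the calculus
# route is a single evaluation, yet they agree closely.
approx = euler(k=0.5, x0=1.0, t_end=10.0, dt=0.0001)
exact = closed_form(k=0.5, x0=1.0, t_end=10.0)
print(abs(approx - exact) < 1e-4)  # True
```

For a linear equation the closed form is free; the hard (and
interesting) part is extrapolating the non-linear systems that have no
such form, which is exactly where the components need to be smarter.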

Steve



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-c97d2393
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-2484a968
Powered by Listbox: http://www.listbox.com
