I have had a thought. You probably get them all the
time, but for me they are rather rare. I mostly just run
programs. Okay, here is the thought:
The possibilities of types of MIND are very broad,
perhaps infinite. What we are all trying to create (AGI) is our kind of
mind, or some specific subset of our kind of mind. What is
intelligence? Ben: "The ability to solve complex problems in a complex
environment." What is OUR intelligence? "The ability to solve
OUR complex problems in OUR complex environment." What is OUR
environment? Three-dimensional space/time operating according to a
somewhat unpredictable set of functions; a vast abyss with occasional galaxies;
our solar system with four gas giants and some scattered debris; our planet
Earth; the biosphere; an intelligent species; various languages, cultures,
politics, religions, economics, governments, communities, THE INTERNET.
The possibilities of types of ENVIRONMENT are very broad, perhaps infinite.
Just as there is no "general" environment, there is no "general" intelligence.
A mind must be matched to its environment. In the 2D, 64-square environment
governed by the rules of chess, Deep Blue is a strong intelligence. An
artificial mind that has been programmed with all of our knowledge about our
environment and all of our skills at solving our problems will be like us,
except for the advantages supplied by the machine hardware. These advantages
are limited in number, specifically describable, predictable, and
understandable. There is ONE general organizational structure that
optimizes this AGI for our environment. All deviations from that one design
only make the AGI function less effectively, and any significant
departures cease to function in any way we would consider
intelligent. The SAI of the future will be vastly more intelligent,
powerful, and amazing. It will not be incomprehensible. It will be a
lot like us.
Mike Deering.
