Given that no "unit" can determine how long it will survive, it makes
sense to have a "point in time" measure for intelligence. I'm thinking
that intelligence is in the "eye of the assessor."
It is possible for an assessor to look at a thermostat and say:
- it is simple
- it is not very informed
- it doesn't do any complicated prediction or modelling
etc.
This means that a thermostat is of low intelligence.
A sophisticated assessor could be familiar with algorithms used in
prediction, and rules used for assertions, and the overall set of rules
and values being "held" by the unit. By studying and accounting for the
features of the "unit," the assessor could make an educated guess about
its level of intelligence, assigning the "high" intelligence attribute
to the unit if it has the right mechanisms.
I am familiar with the concept of maximizing benefit as a proof of
higher intelligence, but I suspect that in practice one will look at the
pieces and make a call about the intelligence of the unit.
And, the good assessor might notice the vulnerability to stupidity.
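The "look at the pieces and make a call" idea above can be sketched in code. This is purely illustrative of Stan's proposal, not anyone's actual method: the feature names and weights below are invented assumptions, standing in for whatever mechanisms a real assessor would inventory.

```python
# Hypothetical sketch of the "assessor" idea: estimate a unit's
# intelligence from its observable mechanisms rather than its outcomes.
# Feature names and weights are invented for illustration.

FEATURE_WEIGHTS = {
    "prediction": 3.0,     # does it forecast future states?
    "modelling": 3.0,      # does it maintain a model of its world?
    "rule_richness": 2.0,  # breadth of the rules and values it "holds"
    "informedness": 1.0,   # how much of its environment it senses
}

def assess(features):
    """Weighted average of mechanism scores, each in [0, 1]."""
    total = sum(FEATURE_WEIGHTS.values())
    return sum(FEATURE_WEIGHTS[k] * features.get(k, 0.0)
               for k in FEATURE_WEIGHTS) / total

# A thermostat: simple, not very informed, no prediction or modelling.
thermostat = {"prediction": 0.0, "modelling": 0.0,
              "rule_richness": 0.1, "informedness": 0.1}

assess(thermostat)  # a low score, matching the "low intelligence" call
```

Note that this assessor never observes the unit's choices at all, which is exactly the contrast Stan draws with "maximizing benefit as a proof of higher intelligence."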
About goals...
The ability to achieve goals I would consider a "simple" form of
intelligence. It seems harder to compare goals and select the one that
is the better opportunity. To find and recognize opportunity takes a
sophisticated mechanism. This mechanism in my view is closer to the
capability we call intelligence.
I took a quick look at the paper and don't think I'm "enabled" to
appreciate it yet. :)
stan
On 11/12/2014 08:56 PM, Ben Goertzel via AGI wrote:
According to my mathematical formalization of the intelligence
concept, intelligence is defined independently of any specific goal,
so that when an organism intelligently seeks self-destruction or
other self-defeating goals, it is still being intelligent relative to
those goals. However, it may have less TOTAL intelligence over time
than a comparable system with intelligence relative to
self-perpetuating or self-actualizing goals, because it will survive
less long ;p
http://agi-conf.org/2010/wp-content/uploads/2009/06/paper_14.pdf
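The distinction Ben draws can be sketched numerically. This is a toy illustration of the general idea (goal-relative competence versus competence accumulated over a lifespan), not the formalism from the linked paper; the function names and numbers are assumptions made up for the example.

```python
# Illustrative sketch: intelligence relative to a set of goals as a
# weighted average of per-goal performance, and "TOTAL intelligence
# over time" as that rate accumulated over the system's lifespan.
# All names and values here are invented for illustration.

def goal_relative_intelligence(performance, weights):
    """Weighted average of per-goal performance scores in [0, 1]."""
    assert len(performance) == len(weights)
    return (sum(p * w for p, w in zip(performance, weights))
            / sum(weights))

def total_intelligence(rate, lifespan):
    """Competence per unit time, accumulated over a lifespan."""
    return rate * lifespan

# A self-destructive agent: competent per step, but short-lived.
self_destructive = total_intelligence(rate=0.9, lifespan=10)

# A self-perpetuating agent: equally competent per step, long-lived.
self_perpetuating = total_intelligence(rate=0.9, lifespan=1000)
```

Both agents are equally intelligent relative to their own goals at any instant, yet the self-perpetuating one accumulates far more total intelligence simply by surviving longer.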
-- Ben G
On Wed, Nov 12, 2014 at 10:55 PM, Stanley Nilsen <[email protected]> wrote:
Just for the fun of it...
What if,
Intelligence is an attribute of a "unit" that correlates with an assessment
of the unit's mechanisms used in the process of generating "higher
quality" choices.
(one could also say that stupidity is an attribute assessing the mechanisms
that generate poor choices.)
As such, a unit would have both intelligence and stupidity.
(Example: an ordinarily intelligent person in love or anger...)
On 11/12/2014 07:37 PM, Ben Goertzel via AGI wrote:
And, the answer to that, I think from your perspective, is that we are
still looking for as yet undiscovered equations of intelligence, which
would not wholesale copy 3 dimensional math, but would be something
new, like a 0-dimensional math (which I've read about but don't
understand).
Yes, I think that in future there will be nice mathematical equations for
the structure and dynamics of intelligent systems...
These will not solve all problems about intelligence immediately, of
course,
just as knowing the Navier-Stokes equation doesn't make the whole of fluid
dynamics trivial ... it just gives a solid basis for fluid dynamics
work...
However, just as the Wright Brothers built a plane without a good
aerodynamic
theory, it may also be possible to build an AGI prior to the existence
of a solid
mathematical theory of AGI...
I don't think it's useful to describe, say, the math of computer
algorithms as
"zero dimensional." It's simply math about structures for which
dimensionality
is not a relevant concept....
-- Ben
-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/9320387-ea529a81
Modify Your Subscription:
https://www.listbox.com/member/?&
Powered by Listbox: http://www.listbox.com