If you want to argue this way (which is reasonable), then you need a
specific definition of intelligence, one that allows it to be accurately
measured (and not just "in principle"). IQ definitely won't serve.
Neither will g. Neither will GPA (if you're discussing a student).
Because of this, while I think your argument is generally reasonable, I
don't think it's useful. Most of what you are discussing is "task
specific", and as such I'm not sure that intelligence is a reasonable
term to use. An expert engineer might be, e.g., a lousy bridge player,
yet both pursuits are thought of as requiring intelligence. I would
assert that in both cases a lot of what's being measured is
task-specific processing, i.e., narrow AI.
(Of course, I also believe that an AGI is impossible in the true sense
of "general", and that an approximate AGI will largely act as a
coordinator among a bunch of narrow AI pieces of varying generality.
This seems to be a distinctly minority view.)
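To make the "coordinator over narrow pieces" picture concrete, here is a minimal sketch. Everything in it (the Coordinator class, the task-type routing, the toy solvers) is a hypothetical illustration of the architecture being described, not anything proposed in this thread:

```python
# Hypothetical sketch: an "approximate AGI" as a coordinator that routes
# tasks to narrow, task-specific solvers rather than solving anything itself.
from typing import Callable, Dict


class Coordinator:
    def __init__(self) -> None:
        # Map from task type to a narrow solver for that task type only.
        self.solvers: Dict[str, Callable[[str], str]] = {}

    def register(self, task_type: str, solver: Callable[[str], str]) -> None:
        self.solvers[task_type] = solver

    def solve(self, task_type: str, task: str) -> str:
        # The coordinator's only "intelligence" is dispatch: it picks the
        # narrow module, or fails when no module covers the task.
        solver = self.solvers.get(task_type)
        if solver is None:
            return "no narrow module for this task"
        return solver(task)


agent = Coordinator()
agent.register("arithmetic", lambda t: str(eval(t)))  # toy narrow "AI"
agent.register("reverse", lambda t: t[::-1])          # another toy module

print(agent.solve("arithmetic", "2+3"))  # 5
print(agent.solve("chess", "e4"))        # no narrow module for this task
```

The point of the sketch is the failure case: outside its registered modules the system has no competence at all, which is exactly the sense in which it is only approximately general.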
Terren Suydam wrote:
Hi Will,
I think humans provide ample evidence that intelligence is not necessarily correlated
with processing power. The genius engineer in my example solves a given problem with
*much less* overall processing than the ordinary engineer, so in this case intelligence
is correlated with some measure of "cognitive efficiency" (which I will leave
undefined). Likewise, a grandmaster chess player looks at a given position and can
calculate a better move in one second than you or I could come up with if we studied the
board for an hour. Grandmasters often do publicity events where they play dozens of
people simultaneously, spending just a few seconds on each board, and winning most of the
games.
Of course, you were referring to intelligence "above a certain level", but if
that level is high above human intelligence, there isn't much we can assume about that
since it is by definition unknowable by humans.
Terren
--- On Tue, 10/14/08, William Pearson <[EMAIL PROTECTED]> wrote:
The relationship between processing power and results is not
necessarily linear or even positively correlated. And an increase in
intelligence above a certain level requires increased processing power
(or perhaps not? anyone disagree?).
When the cost of adding more computational power outweighs the amount
of money or energy that you acquire from adding it, there is not much
point in adding the computational power, except if you are in
competition with other agents that can outsmart you. Some of the
traditional views of RSI neglect this and think that increased
intelligence is always a useful thing. It is not.
There is a reason why much of the planet's biomass has stayed as
bacteria: it does perfectly well like that. It survives.
Too much processing power is a bad thing; it means less for
self-preservation and affecting the world. Balancing them is a tricky
proposition indeed.
Will Pearson
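Will's stopping rule can be sketched as a simple marginal-cost calculation. The function and all the numbers below are made-up illustrations of the argument (diminishing returns per added compute unit against a flat cost), not anything from the thread:

```python
# Toy sketch of the rule: keep adding processing power only while the
# marginal energy/money gained exceeds the marginal cost of the added unit.

def units_worth_adding(marginal_gains, marginal_costs):
    """Count compute units to add before marginal cost outweighs marginal gain."""
    units = 0
    for gain, cost in zip(marginal_gains, marginal_costs):
        if gain <= cost:
            break  # past this point, more compute is a net loss
        units += 1
    return units


# Diminishing returns: each extra unit yields less, while cost stays flat.
gains = [10.0, 6.0, 3.0, 1.5, 0.7]
costs = [2.0] * 5
print(units_worth_adding(gains, costs))  # 3
```

The competition caveat in the email amounts to saying the gain column is not fixed: rival agents can raise the value of extra intelligence, shifting where the break-even point falls.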
-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/