[This message has been crossposted from the AGI list. Apologies for duplication]

Some of the recent discussion has become tangled partly as a result of
different understandings of what the 'Singularity' is and what the
relationship might be between our own minds and hypothetical future
minds.  (Or 'Minds', to use the Iain M. Banks nomenclature).

'Singularity'

When I use that word, I mean a perfectly comprehensible situation in
which we build computer systems that can discover new science and new
technology at speeds that exceed some significant multiple of the speed
at which humans discover those things -- and just for the sake of
argument I usually adopt a 1000x threshold as being both attainable and
radically different from the situation today.  (It is assumed that these
machines will actually produce the new science and technology, of
course, rather than merely being capable of doing so but unwilling.
That raises other issues, but as far as I am concerned the concept of a
Singularity is about the situation where they both can and do start
generating new knowledge at that rate).

In other words, when we get to the point where we get the next thousand
years of knowledge in one year, that is my concept of the Singularity.

There is another concept of the Singularity that involves something like
"when the curves go off to infinity and everything becomes completely
unknowable".

This concept strikes me as outrageously speculative.  First, I don't
have any reason to believe that those curves really will go off to
infinity (there could be limits).  Second, I don't necessarily believe
that the results of the first phase (the type of Singularity I defined
above) will automatically lead to the creation of quasi-infinite minds,
or completely incomprehensible minds, or a completely unpredictable,
incomprehensible world.  All that stuff is wild speculation compared
with the modest reading of the Singularity I gave above.

My definition of the Singularity still implies a wildly different
future, just not the kind of open-ended craziness that some people
speculate about.

Which brings me to this:

'Mind'

I don't want to produce a comprehensive definition of 'mind', but only
make a point about the way the word is being used right now.

When people talk about future minds possibly being incomprehensible to
'us', I find this talk peculiar.

What makes people think that there will ever be a situation in which
there are two separate communities, one of them 'Minds' (in the
IMB/Culture sense) and the other us 'minds'?

If the mild Singularity I described above is what actually happens, then
I would expect a situation in which our own minds have the option of
shuttling back and forth between our present level of intelligence and
the level of the smartest machines around.  I mean that literally: I
foresee a point when we could shift up and down as easily as we (or our
synchromesh transmission systems) shift gears today.

Just as I like to change phones every so often to make sure I have the
coolest, fastest one available, so I see a point when it would be
inconceivable that minds which started out as biology should somehow
feel obliged to stay that way, as a separate species that could not
'understand' the highest-level Minds available at that time.

Why would I 'expect' this situation?  That has to do with how the Minds
would behave, which in turn depends on their motivational systems.
That is a long discussion in itself, but the bottom line is that it is
quite possible (and, I believe, extremely likely) that they would
behave in such a way as to encourage a situation in which human minds
were freely upgradeable at all times.

Personally, I would not necessarily want to stay in the
superintelligent state all the time, but some of the time I certainly
would.

But in that context, it makes no sense to ask whether there would be
minds so advanced that 'we' could never understand them.

Or, to be precise, it is not at all obvious that such a situation will ever exist.



Richard Loosemore.
