My thinking is not too small.  Any more than any other person's on this
distribution list.  But that is not why I am responding.  My response is to
clarify what I meant.  I'm not disagreeing - nor was I trying to
sound brilliant.

I'm certainly not suggesting that I will be the one to invent it.  In fact,
what I was suggesting is that I'm more likely to extend an open source
project (at some point, when it shows human-level intelligence), package
it as an expert system to solve specific domain problems (and yes - this is
still AGI - but directed at a subset of its capabilities), and sell it to a
company with much more distribution power than I myself can create.   I
merely stated, "So, the creators of the first several AGIs will be kings for
a decent amount of time."  Even a narrowly focused AGI, as an expert system,
can be sold for billions.

I can't predict, or define, what the "real deal" is likely to be.  To me,
AGI of human-like intelligence, or even superhuman intelligence, does not
mean you have machines running around masquerading as humans and taking our
jobs.  That is probably well beyond my lifetime (I'm turning 40 this
summer).  I am also suggesting a very soft takeoff.  The Singularity, if it
comes, is likely to come slowly after AGI.  I consider AGI the "true deal".
It's an all-or-nothing thing to create a machine that can think for itself.
If you create an AGI with the intelligence of a 5-year-old, and it can get
progressively smarter and start to make predictions based on what it has
learned over time, is that not the real deal?

Ok.  If it is (and I believe it is), it's a box on my desk.  Going back to
the first businesses and bartering systems, would this box become the only
vendor?  Can it entertain people by playing a role at a theatre, or dance,
or strap on a guitar and play flamenco music that brings you to tears?  I
doubt it.  Now, let me ask you a question:  Do you believe that all AI / AGI
researchers are toiling over all this purely for the challenge, or purely out
of interest?  I doubt that as well.  Surely those elements are drivers
- BUT SO IS MONEY.  This stuff IS the maker of the next software giant.

If this is not the case, how the hell are researchers ever going to get
funding?  If there is no financial return, forget about funding.
Philanthropists (who often do not look for a purely financial return) have
better uses of their money than to fund AGI research.

You can call future currency whatever you like.  Yes, it is likely to change
form - but certainly not purpose.  And Marxism, where perhaps AGI or the real
deal would deflate currency, is an unlikely aftermath of the advent of AGI.

There are tons of applications for it - and the first several groups
that create it - IF they can market it - will be kings for a decent amount
of time.  No empire lives forever.

~Aki

Non-AI researcher
Businessman


On Tue, Mar 25, 2008 at 5:24 AM, Bob Mottram <[EMAIL PROTECTED]> wrote:

> On 25/03/2008, Mark Waser <[EMAIL PROTECTED]> wrote:
> >
> >  You're thinking too small.  The AGI will distribute itself.  And money
> > is likely to be:
> >
> >    - rapidly deflated,
> >    - then replaced with a new, alternate currency that truly values
> >    talent and effort (rather than just playing with the money supply -- aka
> >    interest, commissions, inheritances, etc.)
> >    - while everyone's basic needs (most particularly water, food,
> >    shelter, energy, education, and health care) are provided for free
> >
> > So your brilliant arbitrage to become rich is unlikely to be of much
> > value just a few years later.
> >
>
>
> The arrival of smarter than human intelligence will bring about changes
> which are hard to anticipate, and somehow I doubt that this will mean that
> we all live in some kind of utopia.  The only historical precedent which I
> can think of is the emergence of homo sapiens and the effects which that had
> upon other human species living at the time.  This must have been quite a
> revolution, because the new species was able to manufacture many different
> types of tools and therefore survive in environments which were previously
> inaccessible, or perform more efficiently within existing ones.
>
> There may be a period where proto-AGIs are available and companies can use
> these as "get rich quick" schemes of various kinds to radically automate
> processes and jobs which were previously performed manually.  But once the
> real deal arrives then even the captains of industry are themselves likely
> to be overthrown.  Ultimately evolutionary forces will decide what happens,
> as has always been the case.
>
>
>



-- 
Aki R. Iskandar
[EMAIL PROTECTED]

-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
http://www.listbox.com/member/?member_id=8660244&id_secret=98558129-0bdb63
Powered by Listbox: http://www.listbox.com
