On 01/11/2007, Jiri Jelinek <[EMAIL PROTECTED]> wrote:
> a) how important AGI is

If your main focus is the short term - say the next five to ten
years - then AGI is probably not very important, and you can get by
largely with the technologies which are currently available (or
faster/cheaper versions of them).

If you're interested in the longer term on a scale of decades or
centuries then AGI is going to be extremely important, bringing about
substantial changes.


> b) how many dev teams seriously work on AGI

Very few.


> c) how many investors are willing to spend good money on AGI R&D

I don't spend a lot of time with investors, but my industry experience
suggests that this number is also very small.  Investors may be quite
willing to spend good money on AI projects - specialised systems which
have a specific end product - but that's not really what AGI is about.


> I believe AGI does need promoting. And it's IMO similar with the
> immortality research some of the Novamente folks are involved in. It's
> just unbelievable how much money (and other resources) are being used
> for all kinds of nonsense/insignificant projects worldwide.

Oh yes, there's a high noise-to-signal ratio.


> "cannot predict" - I agree.
> "cannot control" - I disagree. Controlling goals, subgoals, and the
> real world impact (possibly using independent narrow AI tools) will do
> the trick.

Prediction and control are two sides of the same coin.  If you can
predict the behavior of a system to some extent, you stand some chance
of influencing it.

-----
This list is sponsored by AGIRI: http://www.agiri.org/email