Edward W. Porter wrote:
From what you say below, it would appear that human-level AGI would not
require recursive self-improvement, because, as you appear to define it,
humans don't undergo it either (i.e., we don't currently expand the size
of our brains by artificial means).

I wonder what percentage of the AGI community would accept that definition. A
lot of people on this list seem to hang a great deal on RSI, as they use the
term, implying it is necessary for human-level AGI.

RSI is not necessary for human-level AGI.

RSI is only what happens after you get an AGI up to the human level: it could then be used to build a more intelligent version of itself, and so on, up to some unknown plateau. That plateau is often referred to as "superintelligence".
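[A toy numerical sketch of the "improve up to a plateau" idea, purely for illustration. The diminishing-returns assumption, the function name rsi_trajectory, and all the numbers are my own assumptions, not anything Loosemore specifies; the only point carried over from the post is that successive self-improvements could level off at some unknown plateau.]

    # Toy model: each generation of the AGI builds a slightly better successor,
    # but the size of each improvement is ASSUMED to shrink geometrically,
    # so the capability level approaches a plateau.
    def rsi_trajectory(start=1.0, gain=0.5, decay=0.8, generations=30):
        """Return the capability level after each generation of self-improvement."""
        level, levels = start, [start]
        for _ in range(generations):
            level += gain          # successor is a bit more capable
            gain *= decay          # assumed diminishing returns per generation
            levels.append(level)
        return levels

    if __name__ == "__main__":
        traj = rsi_trajectory()
        plateau = 1.0 + 0.5 / (1 - 0.8)   # limit of the geometric series
        print(f"human-level start: {traj[0]:.2f}")
        print(f"after 30 generations: {traj[-1]:.2f} (plateau ~{plateau:.2f})")

[With these made-up parameters the trajectory climbs from 1.0 toward roughly 3.5 and then flattens; nothing about the real shape of such a curve is claimed in the post beyond the existence of some plateau.]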


Richard Loosemore
