Hi All,
For more than a couple of decades now I've believed AGI is just
around the corner. Lately I've begun to explore the 'negative'. Here
are some of my thought experiments, a bit of fun ; )
1) We don't know what General Intelligence really 'is'. Intelligence
remains a blurry concept. Are our current definitions as accurate as
Dalton's was for the atom? How much detail is required before the thing
can be built?
2) Even human intelligence could be defined as 'narrow'. Can any random
human baby be trained to be both a concert pianist and a theoretical
physicist? Perhaps narrow A.I. will gradually expand to cover our needs,
without ever needing to become 'general'. 'Human computer' was once a
profession, and I'd guess one of the first to be replaced by very
narrow 'intelligence'.
3) A 'soul' is required before intelligence can be obtained. Those
without religious beliefs can replace 'soul' with an
'unknown/uncomputable part' (perhaps parallel quantum interactions with
other dimensions). Without it, consider a chimpanzee that believes it
can build a boat, yet puts all its effort into shaping clay to look
like a water bird.
4) Human genetics/evolution play an 'unobtainable' part in general
intelligence. A significant proportion of human behavior is set by
genetics (see identical-twin studies), and even the beginnings of
language appear to be genetic ('ma' and 'da' turn up as parental names
across unrelated languages). Perhaps the 'fine tuning' performed by
evolution cannot be reproduced artificially?
5) Mental stability. People aren't stable: individuals commit suicide,
groups cluster and wage wars. If this instability is an integral part of
general intelligence, perhaps a stable AGI isn't achievable with our
level of understanding/technology.
6) Effectiveness. Alchemists believed they could make themselves rich
by converting lead into gold. The materials are very similar, but the
techniques attempted were mostly futile (though they did spin off some
other nice discoveries). Transmutation is in fact possible today, but
the process is not economically viable. Perhaps by the time we have the
understanding to create general intelligence, we'll have other
technology (maybe brain-enhancement implants) that is far more effective?
What are your 'beliefs'?
:Brett.
-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now