A bunch of narrow AIs working together can be a valuable resource for an AGI. They may even form an integral subsystem of an AGI. But an AGI will still need some component that is fundamentally concerned with abstraction and generalization, rather than with carrying out any particular narrow function.
-- Ben

On Wed, Jul 17, 2019 at 11:50 AM Matt Mahoney <[email protected]> wrote:
>
> On Mon, Jul 15, 2019, 10:13 AM <[email protected]> wrote:
>>
>> https://towardsdatascience.com/no-you-cant-get-from-narrow-ai-to-agi-eedc70e36e50
>>
>> https://medium.com/intuitionmachine/from-narrow-to-general-ai-e21b568155b9
>
> I agree with you that narrow AI won't evolve into AGI. That's why I proposed
> building lots of distributed narrow AIs. To make this work, agents also have
> to know which other agents to refer questions to. If we make reasonable
> assumptions about the ontological structure of knowledge, then a distributed
> representation of this meta-knowledge scales as O(N log N).
>
> Cognitive models seem like the obvious solution to AGI, but many people have
> tried this approach, and their failures need an explanation. OpenCog uses a
> cognitive model and still has no knowledge base or applications after 20
> years.
>
> Legg proved that powerful predictors are necessarily complex, which is why I
> proposed specialization instead. We keep looking for a neat solution that
> doesn't exist. It's not a hard proof. Suppose you have a simple universal
> learner that inputs any computable sequence of bits and learns to predict it
> with less than a 100% error rate. Then I can produce a simple, predictable
> sequence that you cannot predict: my program runs a copy of your program and
> outputs the opposite bit.
>
> The brain is not just a few neural network modules. It has thousands of
> specialized structures, hundreds of types of neurons, and hundreds of
> neurotransmitters. We are born knowing to fear heights and spiders, how to
> swallow, cough, and blink, how to learn language, and how to distinguish
> what is good to eat from among thousands of scents. Altogether we are born
> knowing half of what we know as adults: 10^9 bits encoded in our DNA and
> 10^9 bits of long-term memory for words, pictures, and sounds.
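[Matt's diagonalization argument above can be sketched in a few lines of Python. This is a minimal illustration, not code from the thread; the function names and the example "majority" predictor are my own.]

```python
def adversarial_sequence(predictor, length):
    """Build a bit sequence the given predictor mispredicts on every step.

    `predictor` is any function mapping the history so far (a list of bits)
    to its predicted next bit. We run a copy of the predictor and output
    the opposite of whatever it guesses -- the diagonalization step.
    """
    history = []
    for _ in range(length):
        guess = predictor(history)   # run a copy of the predictor
        history.append(1 - guess)    # emit the opposite bit
    return history

# Example predictor: always guess the majority bit seen so far.
def majority_predictor(history):
    return 1 if sum(history) * 2 > len(history) else 0

seq = adversarial_sequence(majority_predictor, 16)
# By construction, every prediction on this sequence is wrong,
# even though the sequence itself is simple and computable.
assert all(majority_predictor(seq[:i]) != seq[i] for i in range(len(seq)))
```

The same construction works against any computable predictor, which is the point: a single simple learner cannot predict all simple sequences.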
> There is no good way to avoid coding the inherited knowledge, because
> evolution took 10^48 base copy operations to write it. The code is 300M
> lines.
>
> Google has 100,000 employees, so they can write this much code. They have
> millions of CPUs and billions of users, so they can collect the learnable
> part of the knowledge. My proposed solution isn't any cheaper. But good
> luck if you think you can do better.

--
Ben Goertzel, PhD
http://goertzel.org

“The only people for me are the mad ones, the ones who are mad to live, mad to talk, mad to be saved, desirous of everything at the same time, the ones who never yawn or say a commonplace thing, but burn, burn, burn like fabulous yellow roman candles exploding like spiders across the stars.” -- Jack Kerouac

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/Tbee4f4d703cc71ad-Me57b6052cb6ea2a8f10d47cd
Delivery options: https://agi.topicbox.com/groups/agi/subscription
