Ben, do you feel that these changes to OpenCog will address the current
obstacles to AGI? What do you believe were the reasons it did not meet
the goals of the 2011 timeline (
https://www.nextbigfuture.com/2011/03/opencog-artificial-general-intelligence.html
), which forecast "full-on AGI" in 2019-21 and recursive
self-improvement in 2021-23? Obviously "rebuilding a lot of OpenCog
from scratch" doesn't bode well.

If I recall correctly, in 2011 OpenCog consisted of an evolutionary learner (MOSES),
a neural vision model (DeSTIN), a rule based language model (RelEX,
NatGen), and Atomspace, which was supposed to integrate it all together but
never did except for some of the language part. Distributed Atomspace also
ran into severe scaling problems.

I assume the design changes address these problems, but what about other
obstacles? MOSES and DeSTIN never advanced beyond toy problems because of
computational limits, but perhaps they could be distributed. After all,
real human vision is around 10^15 synapse operations per second [1], and
real evolution is 10^29 DNA copy OPS [2]. Do the design changes help with
scaling to parallel computing?

I never did understand why OpenCog went with rule-based language
modeling after its long history of failure. Problems like ambiguity,
brittleness, and most importantly the lack of a learning algorithm have
only been solved in practice by enormous neural/statistical models.
NNCP, the new leader on the large text benchmark (
http://mattmahoney.net/dc/text.html#1123 ), takes 6 days to compress 1
GB of text on a GPU with 10,496 CUDA cores. It runs a Transformer
algorithm, a neural network with an attention mechanism.
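For readers who haven't seen it, here is a minimal sketch of scaled
dot-product attention, the core operation of the Transformer mentioned
above. The sizes and random inputs are illustrative only, not the
benchmark model's actual configuration:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # how well each query matches each key
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V             # attention-weighted mix of the values

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # 4 query positions, model dimension 8
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = attention(Q, K, V)
print(out.shape)  # (4, 8)
```

A real Transformer stacks many such layers, with learned projections
producing Q, K, and V from the input, which is where the enormous
compute cost comes from.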

Actual working NLP like Google, Alexa, and Siri seems to require the
backing of companies with trillion-dollar market caps (Alphabet $1.36T,
Amazon $1.55T, Apple $2.04T). GPT-3 is still experimental, and it isn't
cheap either.

So I'm wondering: do you have a new timeline, or have you adjusted your
goals and your plans for achieving them?

1. The brain has 6 x 10^14 synapses. The visual cortex is 40% of the brain.
I assume an operation takes 10 to 100 ms.

2. There are 5 x 10^36 DNA bases in the biosphere at 2 bits each. I assume
the replication rate is the same as the atmospheric carbon cycle, 5 years.
If you include RNA and amino acid operations, the rate is 10^31 OPS.
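As a sanity check on the two footnote estimates, here is the arithmetic
spelled out; the constants are just the footnotes' own assumptions:

```python
# Footnote 1: visual-cortex synapse throughput.
SYNAPSES = 6e14            # total synapses in the brain
VISUAL_FRACTION = 0.4      # visual cortex's share of the brain
OP_TIMES_S = (0.01, 0.1)   # assumed 10-100 ms per synapse operation

vision_ops = [SYNAPSES * VISUAL_FRACTION / t for t in OP_TIMES_S]
print(f"vision: {min(vision_ops):.1e} to {max(vision_ops):.1e} ops/sec")
# roughly 2.4e15 to 2.4e16, matching the ~10^15 figure above

# Footnote 2: biosphere DNA copy rate.
DNA_BASES = 5e36               # DNA bases in the biosphere
CYCLE_S = 5 * 365 * 24 * 3600  # assumed 5-year replication cycle
copy_ops = DNA_BASES / CYCLE_S
print(f"DNA copying: {copy_ops:.1e} ops/sec")
# roughly 3.2e28, which rounds to the ~10^29 figure above
```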


On Wed, Feb 24, 2021, 2:24 PM Ben Goertzel <[email protected]> wrote:

> Well we are rebuilding a lot of OpenCog from scratch in the Hyperon
> initiative...
>
> One of the design goals is to embed as many of the needed
> cognitive-algorithm-related abstractions as possible in the Atomese 2
> language, so that the cognitive algos themselves become brief simple
> Atomese scripts
>
> The theory in this paper is mostly oriented toward figuring out what
> abstractions are most critical to embed in the Atomese2 interpreter in
> ways that are both easy-to-use for the developer and highly efficient
> (in concurrent and distributed processing scenarios)
>
> Current OpenCog architecture has all the cognitive algos using
> Atomspace, and many using the Pattern Matcher and URE Unified Rule
> Engine, but other than that the algos are using separate code yeah.
> Hyperon architecture aims to factor out more of the commonalities btw
> the different cognitive algos, and it seems that baking probabilistic
> dependent types and metagraph folds/unfolds into the Atomese2 language
> can be a big step in this direction...
>
> ben
>
> On Wed, Feb 24, 2021 at 10:08 AM Mike Archbold <[email protected]>
> wrote:
> >
> > In OpenCog the code is kind of compartmentalized -- disparate
> > algorithms in isolation called as necessary. That has been my
> > impression at least. But I think in this proposed architecture an
> > integration is attempted, which makes sense.
> >
> > On 2/24/21, Ben Goertzel <[email protected]> wrote:
> > > "Patterns of Cognition: Cognitive Algorithms as Galois Connections
> > > Fulfilled by Chronomorphisms On Probabilistically Typed Metagraphs"
> > >
> > > https://arxiv.org/abs/2102.10581
> > >
> > > New draft paper that puts various OpenCog cognitive algorithms in a
> > > common mathematical framework, and connects them with implementation
> > > strategies involving chronomorphisms on metagraphs...
> > >
> > > ****
> > > It is argued that a broad class of AGI-relevant algorithms can be
> > > expressed in a common formal framework, via specifying Galois
> > > connections linking search and optimization processes on directed
> > > metagraphs whose edge targets are labeled with probabilistic dependent
> > > types, and then showing these connections are fulfilled by processes
> > > involving metagraph chronomorphisms. Examples are drawn from the core
> > > cognitive algorithms used in the OpenCog AGI framework: Probabilistic
> > > logical inference, evolutionary program learning, pattern mining,
> > > agglomerative clustering, and nonlinear-dynamical
> > > attention allocation.
> > >
> > > The analysis presented involves representing these cognitive
> > > algorithms as recursive discrete decision processes involving
> > > optimizing functions defined over metagraphs, in which the key
> > > decisions involve sampling from probability distributions over
> > > metagraphs and enacting sets of combinatory operations on selected
> > > sub-metagraphs. The mutual associativity of the combinatory operations
> > > involved in a cognitive process is shown to often play a key role in
> > > enabling the decomposition of the process into folding and unfolding
> > > operations; a conclusion that has some practical implications for the
> > > particulars of cognitive processes, e.g. militating toward use of
> > > reversible logic and reversible program execution. It is also observed
> > > that where this mutual associativity holds, there is an alignment
> > > between the hierarchy of subgoals used in recursive decision process
> > > execution and a hierarchy of subpatterns definable in terms of formal
> > > pattern theory.
> > > ****
> > >
> > > --
> > > Ben Goertzel, PhD
> > > http://goertzel.org
> > >
> > > “He not busy being born is busy dying" -- Bob Dylan
> 
> 
> --
> Ben Goertzel, PhD
> http://goertzel.org
> 
> “He not busy being born is busy dying" -- Bob Dylan

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Ta5ed5d0d0e4de96d-Mc91706bfbcde4a564f4797f2
Delivery options: https://agi.topicbox.com/groups/agi/subscription
