On Fri, Nov 12, 2021 at 9:07 PM Douglas Miles <[email protected]> wrote:

>
>
>
>>
>>
>>
>>> On Tue, Nov 9, 2021 at 18:07, Linas Vepstas <[email protected]>
>>> wrote:
>>>
>>>
>> OK, there are two or five distinct conversations and ideas I am trying
>> to have.
>>
>> 2) I did add an s-expression module to the existing AtomSpace. It took
>> about an afternoon of coding. It works -- you can store arbitrary
>> s-expressions as abstract syntax trees in the atomspace. You can then apply
>> all the other AtomSpace machinery to work with this. There's a demo online.
>>
>
> Awesome... would you point me to the demo link?
>

It's here:


https://github.com/opencog/atomspace/blob/master/examples/foreign/sexpr-query.scm

Caution: it's "raw Atomese" -- if you ever wanted to use this, you would
want to write convenience wrappers around it to make it easier to use.
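For intuition, here's a minimal Python sketch -- purely illustrative, not
the AtomSpace API -- of what "storing an s-expression as an abstract syntax
tree" means. The real module keeps the tree as Atoms; this just uses nested
lists:

```python
# Illustrative only: a tiny s-expression parser that builds the kind of
# abstract syntax tree the AtomSpace module stores. The real module keeps
# these trees as Atoms; this sketch just uses nested Python lists.

def tokenize(text):
    """Split an s-expression string into parenthesis and symbol tokens."""
    return text.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    """Recursively build a nested-list AST from the token stream."""
    token = tokens.pop(0)
    if token == "(":
        node = []
        while tokens[0] != ")":
            node.append(parse(tokens))
        tokens.pop(0)  # discard the closing ")"
        return node
    return token  # a bare symbol is a leaf node

ast = parse(tokenize("(likes (cat mat) sitting)"))
print(ast)  # ['likes', ['cat', 'mat'], 'sitting']
```

Once the expression is a tree like this, generic tree machinery (pattern
matching, queries, rewriting) applies to it -- that's the whole point of
putting it in the atomspace.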


>
>>
>> 3) Just like the above, I was going to add a JSON module to the existing
>> AtomSpace, but I got bored before I finished it. It would take about a day
>> to add. It would allow you to store arbitrary JSON in the AtomSpace, and
>> then apply all the other AtomSpace machinery to work with it (the JSON
>> being stored as name-tagged abstract syntax trees, which is what JSON
>> already is). With this, one could build a competitor to systems such as
>> grakn.ai ... but so what? No existing grakn.ai customer will ever switch
>> to the AtomSpace, and the likelihood of new users seems infinitesimal. So
>> why bother?
>>
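To make point 3) concrete: here's a rough Python sketch -- again
illustrative, not the AtomSpace API -- of why JSON is already a name-tagged
tree. Walking it yields (path, leaf) pairs, where each path element is the
name tag on an interior node:

```python
import json

# Illustrative only: walk a parsed JSON document and emit (path, leaf)
# pairs, showing that JSON is a tree whose interior nodes carry name tags --
# the same shape a name-tagged AST module would store.

def walk(value, path=()):
    """Yield (path, leaf-value) pairs from a parsed JSON tree."""
    if isinstance(value, dict):
        for name, child in value.items():
            yield from walk(child, path + (name,))
    elif isinstance(value, list):
        for index, child in enumerate(value):
            yield from walk(child, path + (index,))
    else:
        yield path, value

doc = json.loads('{"pet": {"name": "Rex", "tricks": ["sit", "speak"]}}')
for path, leaf in walk(doc):
    print(path, "->", leaf)
# ('pet', 'name') -> Rex
# ('pet', 'tricks', 0) -> sit
# ('pet', 'tricks', 1) -> speak
```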
>>>
>>>
>
> Is AtomSpace ever (or always?) used to supply a connection between the
> OpenCog projects as a live KR system?  That is, is it a place that could
> hold their configurations and bootstrap data, and share their deductions
> at runtime? (Such as the world state of a simulation.)
>

Since the dawn of time. That is its reason for being.

It was created in 2001 or 2002 in Brazil for this purpose, amid a mad
scramble at the time around the Tamagotchi craze. As I recall, Disney paid
$250M for some animated avatar penguin or something. There are some opencog
youtube videos from circa 2009 that show an animated dog that you can talk
to: https://www.youtube.com/watch?v=FEmpGRLwbqE and
https://www.youtube.com/watch?v=vZtnjKcrdZQ and
https://www.youtube.com/watch?v=ii-qdubNsx0  Those all use the atomspace
under the covers. You can even dig the code for that out of github. It's
all there.

Since then... heck. It was used in some of the Hanson Robotics Sophia
demos. It was used to data-mine genes associated with longevity. I think
you were the person who typed "the cat sat on the mat" into the IRC chatbot
circa 2011 or 2012 -- that ran on the atomspace. The atomspace is the part
that "remembers" the earlier conversation, and provides all the background
facts.

I'm doing language learning on it now -- millions of atoms in it. Circa
2018, Ben's Russians hooked up TensorFlow to it. Everything that the URE
and PLN do is on the atomspace. The agi-bio code, where the genetics work
gets done, is all on the atomspace -- millions of genes and proteins.

So, yes, it's the core component. I don't think that there is anything at
all in the opencog code base that does not use it directly.

--linas

>
> --
Patrick: Are they laughing at us?
Sponge Bob: No, Patrick, they are laughing next to us.

To view this discussion on the web visit 
https://groups.google.com/d/msgid/opencog/CAHrUA354-GDLA3zdUiodM9Zxy4Fbo8bC6tP2gZY-uZhSiZZP9g%40mail.gmail.com.