Re: [opencog-dev] Re: Calling forward/backward chainer

2017-04-22 Thread Vishnu Priya


Hi Nil,


Currently there are no working example files for the forward/backward chainer 
(FC/BC), so I thought I could come up with some examples and contribute.

So far I have tested a simple rule like Deduction and got that working. Now I 
want to try some other rules.
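
For reference, the toy knowledge base I used for the deduction test looked 
roughly like this (the atom names are only illustrative):

  (use-modules (opencog))

  ; Two premises with full confidence.
  (InheritanceLink (stv 1 1)
     (ConceptNode "Socrates")
     (ConceptNode "man"))
  (InheritanceLink (stv 1 1)
     (ConceptNode "man")
     (ConceptNode "mortal"))

  ; After running the deduction rule, the chainer infers
  ; (InheritanceLink (ConceptNode "Socrates") (ConceptNode "mortal")).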

As per 
https://github.com/opencog/atomspace/blob/master/tests/rule-engine/BackwardChainerUTest.cxxtest#L510, 
a few rules (conditional-instantiation-meta-rule.scm, 
fuzzy-conjunction-introduction-rule.scm and the deduction rule) are applied to 
"criminal.scm" for testing. I have tried the same rules on the same input, but 
I do not get any inference; I only get an empty SetLink.
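
For what it's worth, my call from the guile shell looks roughly like the 
sketch below. The rule-base name and the target are only illustrative of the 
shape I am using, and I am quoting the cog-bc argument order from memory, so 
it may not match the current rule-engine API; please correct me if that is 
where I am going wrong.

  (use-modules (opencog) (opencog rule-engine))

  ; Load the knowledge base (the attached file); the rule base itself
  ; is configured as in the unit test.
  (load "criminal.scm")

  ; Target: find $who such that $who is a criminal (shape is illustrative).
  (define target
     (EvaluationLink
        (PredicateNode "criminal")
        (ListLink (VariableNode "$who"))))

  ; Run the backward chainer over the configured rule base.
  (cog-bc (ConceptNode "crime-rule-base") target)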

I have attached the input file for your reference.

I also tested the same rules on the animals.scm example but got no output. :-(

I don't know what is missing here. Any help would be much appreciated.

Thanks in advance,
Vishnu



[Attachment: Criminal.scm (binary data)]


Re: [opencog-dev] Re: word2vec within openCog language learning?

2017-04-22 Thread Jesús López
Hi again, I just wanted to drop a couple of thoughts.

What I'm talking about is more of a conceptual exploration, categorically
and linguistically motivated, while what Ben talks about is more neural and
hands-on. What would be nice is connecting the two threads.

Previously Ben said:
> The semiring could also be a non-Boolean algebra of relations on
> graphs or hypergraphs

That would mean replacing the numbers in the word2vec vectors (and in the
Coecke tensors!) by whole relations (relations on hypergraphs are much
fatter than mere numbers), which I'm not sure you'd even want. I don't
remember seeing this done before. For good or bad, last week
arXiv:1704.05725 appeared in the categorical quantum mechanics setting,
where they seem to be doing just that sort of thing: substituting an
arbitrary C*-algebra for the field of complex numbers. If you can cast
your algebra of relations as a C*-algebra, that would push the idea
somewhat further, though I don't really know how far it goes
semantically, not to speak of learning parameters. One would also need
the glue to apply that paper's idea to the quantum flavour of Coecke
semantics.
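
To make that a bit more concrete (my own gloss, not the paper's notation):
an ordinary word vector lives in C^n with scalar entries; replacing the
scalars by elements of a C*-algebra A makes it an element of A^n, and the
inner product becomes

  \langle v, w \rangle = \sum_i v_i^* w_i

valued in A rather than in C (the Hilbert C*-module setting). For the
relational reading, "relations on hypergraphs" would have to be packaged
as such an algebra, with relational composition presumably playing the
role of multiplication; whether that really satisfies the C*-axioms is
exactly the "if" above.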

I can't help on the GAN stuff because I haven't done my homework there.
However, I would also look at what Socher did in 2013. Typical neural nets
are flat sandwiches: rectangles of weights (linear maps) with a vector of
nonlinearities stacked on top, and so on. Socher introduced/used *tensor*
neural nets, where he used a *cube* of weights for a *bi*-linear
transformation followed by a nonlinearity. His units transform pairs of
vectors into single vectors, and his network topology is a binary tree
(instead of the linear stacking of layers in a classical NN). If you have
a fragment of English generated by a CFG, the parse tree (a true tree) can
be binarized [1], and each node would be a Socher net unit, with the
leaves being distributional (word2vec) vectors.
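
If I remember the 2013 RNTN paper correctly, the composition at a node is
roughly

  p = f( [a;b]^T V^{[1:d]} [a;b] + W [a;b] )

where a, b in R^d are the two child vectors, V^{[1:d]} is the 2d x 2d x d
weight "cube" giving the bilinear part, W is a d x 2d matrix giving an
ordinary linear part, and f is an element-wise nonlinearity such as tanh
(I may be off on the exact shapes).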

The difference between this and Coecke is that in the latter there is no
binarization (general multilinear tensors instead), and the net is not a
tree but a DAG. More importantly, of course, Socher has nonlinear toppings
on the nodes and an actual learning algorithm, something left more or less
for the future in the Coecke view, despite some efforts. So basically, if
you put a nonlinear topping or hat on each node of what I was calling a
tensor network, you should arrive at a neural tensor net. Just split the
rank r of the tensor as r = u + v, with u the number of contravariant
(input) indices and v the number of covariant (output) indices. Each node
tensor then takes u *vectors* as inputs (2 in Socher's case) and produces
v output vectors. One needs an analogue of the element-wise nonlinearity
in this context, but I don't know which. As the topology can include
"diamond" paths, one needs a suitable learning method; I've read about
what's called backpropagation through structure in tensor neural net
papers.
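
In symbols (my paraphrase): a node of rank r = u + v is a multilinear map

  T : V^{\otimes u} \to V^{\otimes v},

so it eats u input vectors and emits v output vectors (u = 2, v = 1 in
Socher's case), and the "nonlinear topping" would be some element-wise
\sigma applied to each of the v outputs, i.e. \sigma(T(x_1, ..., x_u)).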

Another technical difference, just so I'm not misstating things: Socher's
units also add an extra contribution to the output from a classical
(linear) NN stage, on top of the bilinear part.

All of the above is relevant only if one has a serious interest in the Coecke approach to semantics.

Note that while Coecke theory is very pleasant categorically, the
nonlinear toppings have not received any attention from categorists
that I know of.

On the purely categorical side of understanding this same problem, and
forgetting parameter learning for a moment, I had a little realization to
share. I talked before about categories arising from several monads as
*targets* of the Coecke semantic functor. Later I remembered that the
source also has a monad flavour: sequences of things can be understood
through the list monad, from the functional-programming viewpoint, or the
free monoid monad for the purists. One can thus see sentences as sequences
of words (lexical entities) given by a specific monad. So we have monad
flavour in both the source and the target of the semantics functor, which
prompts questions about the character of the functor itself.
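
Spelled out, and this is completely standard: the free monoid monad sends
a set of words X to the set of finite sequences

  T X = X^* = \coprod_{n \ge 0} X^n,

with unit \eta_X : X \to X^* taking a word to the one-word sequence and
multiplication \mu_X : (X^*)^* \to X^* flattening a sequence of sequences
by concatenation; the list monad of functional programming is the same
construction under another name.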

Those thoughts put me in the functional-programmer mindset, and I
remembered an old paper by Wadler about understanding recursive-descent
parsers for domain-specific languages given by a context-free grammar in
monadic terms (in functional programming, using Moggi's ideas on computing
with monads). The topic is called monadic parsing, and it is developer
territory. Interestingly, this viewpoint is permeating into linguistics as
well, as demonstrated by "Monads for natural language semantics" (Shan),
which treats semantics as a monad transformer. We are at the point where
there is even a section called "The CCG monad" in the book with ISBN
9783110251708.

I don't know of work reconciling the monadic viewpoint with Coecke
stuff, but it is intriguing.

Regards, Jesús.


[1] http://images.slideplayer.com/15/4559376/slides/slide_39.jpg




On 4/13/17, Ben Goertzel  wrote:
> OK, let me try to rephrase this more clearly...
>
> What I am thinking is --
>
> In the GAN, the generative network takes in some random noise
> variables, 

Re: [opencog-dev] Re: Best textbook (most relevant to Opencog Node and Link Types) in Knowledge representation

2017-04-22 Thread Daniel Gross
Hi Linas, 

Thank you for the example:

I think this again helps me articulate my further questions:

How is this additional conceptual knowledge harvested in a way that, on the one 
hand, mimics human thinking (i.e. it deploys context adequately, establishes 
relevant abstractions and, generally, creates a structure that is parsimonious 
and conceptual (i.e. thing, not string)) and, on the other hand, supports 
effective and efficient autonomous, goal-directed reasoning over it?

And, thinking out loud some more -- what if a lot of common-sense knowledge 
is implicit and not observable? We can observe what people do, but it's much 
harder to know why they do it, and to see the connective 
(socio-psychological-cultural and value-laden, a now-favourite word of mine) 
tissue of personal experiences that holds it all together and gives it 
explanatory meaning.

thank you,

Daniel 


On Friday, 21 April 2017 18:50:33 UTC+3, linas wrote:
>
>
> On Fri, Apr 21, 2017 at 10:12 AM, Daniel Gross wrote:
>
>> In context of A one morphism may hold, in context B another -- and you 
>> indicated two kinds of contexts: domains (swimming, rowing) and a 
>> human-introspective, value-laden interpretive context.
>
>
>
> To return to Alex's original question, there was a question of how to 
> represent knowledge in a computer. So, for opencog, a very minuscule 
> subset of the knowledge graph might be:
>
> ContextLink
>  ConceptNode "swimming"
>  EvaluationLink
>PredicateNode "catch"
>PhysicalMotorMovementLink
>  PositionLink...
>  VelocityLink
>
> that's the general idea. The above is actually a rather poor design for 
> representing that knowledge: instead of position and velocity, it should be 
> about hand and wrist. Instead of PredicateNode "catch" it should be 
> PredicateNode "catch as taught by Mark", with additional links to Mark and 
> why his technique differs from the catch as taught by coach Ted. So this 
> simplistic graph representation blows up out of control very rapidly, 
> which is why it cannot be hand-authored: it's why the system must 
> automatically discern and learn such structures.
>
> BTW, in opencog, any two-element link is a "morphism"
>
>SomeLink
> SomeNode "source"
> OtherNode "target"
>
> It's OK to think of that as an arrow from source to target. But it's also 
> OK to think about it as a binary tree, with "SomeLink" being the root, and 
> the two nodes being the leaves. So there are multiple ways to diagram 
> these things.
>
> --linas
>



Re: [opencog-dev] Re: word2vec within openCog language learning?

2017-04-22 Thread Jesús López
Correction: swap co- and contravariant in my previous message.
