Here is a paper titled "A Grounded Deep Symbolic Neural Network for Perception"
that I am planning to submit to the NeSy 2022 workshop.
It is one of a number of workshops at the Second International Joint Conference on Learning and Reasoning (IJCLR 2022), to be held in Windsor, UK on 28th–30th September. Papers are due by May 31st. I would appreciate any constructive feedback you would like to offer.

https://www.adaptroninc.com/sites/default/files/inline-files/Grounded_Deep_Symbolic_Neural_Network_for_comments.pdf

Abstract
Both animals and artificially intelligent agents rely upon the identification of types of objects and events during perception. This is a categorization process that senses, recognizes and encodes invariant features. Binary neurons (binons) are general-purpose artificial neural nodes for representing properties, objects, events and the relationships between them. Non-symbolic binons are used in short-term memory to represent core sensory properties, such as position, intensity and time, as well as properties derived from them. Ratios derived from these properties are converted into invariant symbolic categories using a novel discretization algorithm based on the Weber-Fechner psychophysical laws. Symbolic binons are combined to form the deep hierarchical neural networks that comprise long-term memory. This memory contains spatial and temporal binons representing the shape and contrast patterns for categories of objects and events. These binons are grounded in the core and derived properties. Empirical evidence of their successful use in classifying handwritten digits was provided by Martensen in 2013 [1]. The neural networks are 100% symbolic, transparent, compositional, scalable and sparse. Learning is continuous and unsupervised.
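
To give a concrete flavour of the Weber-Fechner discretization step, here is a minimal sketch in Python of one way a property ratio could be mapped to an invariant symbolic category. It assumes a simple logarithmic binning with an illustrative 10% Weber fraction; it is only a simplified illustration of the general idea, not the actual algorithm in the paper.

import math

def weber_fechner_category(ratio: float, weber_fraction: float = 0.1) -> int:
    """Map a property ratio to a discrete symbolic category index.

    Fechner's law says perceived magnitude grows with the logarithm of the
    stimulus ratio, and Weber's law says the just-noticeable difference (JND)
    is a constant fraction of the stimulus.  Taking the logarithm in base
    (1 + weber_fraction) counts how many JND steps the ratio spans, and
    rounding gives an invariant integer category.  The 10% Weber fraction
    is an assumed, illustrative value.
    """
    if ratio <= 0:
        raise ValueError("ratio must be positive")
    return round(math.log(ratio) / math.log(1.0 + weber_fraction))

# Two ratios that differ by less than one JND fall into the same symbolic
# category, so the encoding is invariant to small sensory fluctuations.
print(weber_fechner_category(2.00))  # -> 7
print(weber_fechner_category(2.03))  # -> 7 (same category)
print(weber_fechner_category(4.00))  # -> 15 (roughly twice as many JND steps)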