Thanks Mike!
I just updated my introduction; it's even more abstract than Brett's :)

http://www.cognitivealgorithm.info/
Intelligence is a general cognitive ability, ultimately the ability to
predict. That includes planning, which is technically self-prediction.
Any prediction is an interactive projection of known patterns, hence the
first step must be pattern discovery (AKA unsupervised learning, though
all such negation-first terms are obfuscating). My definitions are not
terribly radical; pattern recognition is at the core of any IQ test. But
there is no conceptually consistent bottom-up implementation, so I had
to design the process from scratch.


For excellent popular introductions to the cognition-as-prediction
perspective, see "On Intelligence" <http://www.onintelligence.org/> by
Jeff Hawkins and "How to Create a Mind"
<http://www.amazon.com/How-Create-Mind-Thought-Revealed/dp/0670025291/ref=cm_cr_pr_product_top>
by Ray Kurzweil. But on a technical level, they, like almost everyone
else, use neural nets, which work in a very coarse statistical fashion.
I think the best way to conceptualize a basic NN, the multi-layer
perceptron
<https://towardsdatascience.com/what-the-hell-is-perceptron-626217814f53>,
is as fuzzy centroid-based clustering
<https://en.wikipedia.org/wiki/Cluster_analysis#Centroid-based_clustering>.
Each node weighs its inputs, then sums and thresholds them into an
output. This normalized sum of inputs is their centroid.
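A minimal Python sketch of this reading (the function names are my own
illustration, not CogAlg code): a node computes a thresholded weighted
sum, and with weights normalized to sum to 1, that sum is just a
weighted average of the inputs, i.e. their centroid.

```python
def node_output(inputs, weights, bias=0.0):
    # Weighted sum of inputs, then a hard threshold (step activation):
    # the basic perceptron node.
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if s > 0 else 0

def fuzzy_centroid(inputs, weights):
    # With weights normalized to sum to 1, the weighted sum is a
    # weighted average of the inputs: their centroid.
    total = sum(weights)
    return sum(w / total * x for w, x in zip(weights, inputs))

print(node_output([1.0, 2.0, 3.0], [0.2, 0.3, 0.5]))     # 1 (sum 2.3 > 0)
print(fuzzy_centroid([1.0, 2.0, 3.0], [0.2, 0.3, 0.5]))  # 2.3
```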


The top-layer output is compared to some template, and the resulting
error is backpropagated to adjust the weights. This is a soft
clustering: weighting modulates the inclusion or exclusion of subsequent
inputs in the output. But weighted summation randomly degrades input
resolution, which degrades the whole subsequent comparison and training
process. This degradation compounds exponentially with the number of
layers, leading to largely brute-force fitting over a ridiculous number
of backprop cycles.
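To make the "compare to template, then adjust weights" step concrete,
here is a minimal delta-rule sketch for a single linear node (a generic
illustration of backprop-style training, not CogAlg code):

```python
def delta_rule_step(inputs, weights, target, lr=0.1):
    # Compare the node's output to a template (target), then nudge each
    # weight in proportion to the error and its input: soft, graded
    # inclusion or exclusion of that input in future outputs.
    output = sum(w * x for w, x in zip(weights, inputs))
    error = target - output
    return [w + lr * error * x for w, x in zip(weights, inputs)]

weights = [0.0, 0.0]
for _ in range(50):  # repeated cycles shrink the error toward zero
    weights = delta_rule_step([1.0, 2.0], weights, target=5.0)
```

Note that each step only sees the scalar error, not the original inputs'
full resolution, which is the degradation complained about above.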


An alternative is connectivity-based clustering
<https://en.wikipedia.org/wiki/Cluster_analysis#Connectivity-based_clustering_(hierarchical_clustering)>,
where the first step is input cross-comparison at the original
resolution. Most modern methods combine such lateral cross-correlation
with vertical training. A CNN performs edge detection at the bottom,
which is the same as cross-comparison within kernels, but with trained
weights on the compared nodes. Attention heads in transformers
<https://www.quantamagazine.org/researchers-glimpse-how-ai-gets-so-good-at-language-processing-20220414/>
and edges in graph NNs are also initiated as lateral correlations, later
weighted by vertical training. Similar positional encoding was explored
in Hinton's Capsule Networks
<https://medium.com/ai%C2%B3-theory-practice-business/understanding-hintons-capsule-networks-part-i-intuition-b4b559d1159b>.
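In its simplest 1D form, lateral cross-comparison can be sketched as
follows (my own toy example, not the project's implementation); each
derivative is computed at the original input resolution, before any
training:

```python
def cross_compare(inputs):
    # Compare each pair of consecutive inputs at original resolution:
    # the difference is a 1D analogue of edge detection, computed
    # laterally, with no trained weights involved.
    return [b - a for a, b in zip(inputs, inputs[1:])]

print(cross_compare([3, 3, 7, 2]))  # [0, 4, -5]: edges are nonzero differences
```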


But this combination is not consistent with scalable generality, where
incrementally higher levels are formed recursively. If connectivity
clustering is superior to vertical training at any point, then it is
superior on all levels; levels should differ only in the depth of
recursion that generates them.
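As a toy illustration of that claim (my own sketch, with an arbitrary
clustering rule, not CogAlg's): the same connectivity-clustering step
can be applied recursively, so levels differ only in recursion depth:

```python
def form_level(inputs, thresh=1):
    # Cluster consecutive inputs whose difference is within thresh,
    # then summarize each cluster by its mean for the next level.
    clusters, current = [], [inputs[0]]
    for a, b in zip(inputs, inputs[1:]):
        if abs(b - a) <= thresh:
            current.append(b)
        else:
            clusters.append(current)
            current = [b]
    clusters.append(current)
    return [sum(c) / len(c) for c in clusters]

def form_hierarchy(inputs, depth):
    # The same operation applied recursively: higher levels differ
    # only in the depth of recursion that generates them.
    levels = [inputs]
    for _ in range(depth):
        levels.append(form_level(levels[-1]))
    return levels

print(form_hierarchy([1, 2, 9, 10], 1))  # [[1, 2, 9, 10], [1.5, 9.5]]
```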


I propose encoding connectivity clusters with a unique set of
parameters derived from cross-comparison. These derivatives include
match as a measure of compression, AKA predictive value: a common
fitness function across the system. This encoding is very complex
upfront; there is no way it could have evolved naturally. The
complexity is deeply structured and utterly decontextualized, which is
probably why no one seems to work on such methods. But these methods
don't need interminable opaque training; my feedback only adjusts
hyperparameters.
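One way to read match as a measure of compression, sketched under my
own assumption that match is the shared magnitude of two comparands
(min is one simple choice; the project may define it differently):

```python
def comp(a, b):
    # Hypothetical derivatives from comparing two values: the difference
    # preserves unique information, while match, taken here as min(a, b),
    # measures their overlap: how much of one comparand is predictable,
    # hence compressible, given the other.
    match = min(a, b)
    difference = b - a
    return match, difference

print(comp(4, 7))  # (4, 3): 4 units shared, 3 units unique
```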


In the next section, I define atomic comparison and the resulting
patterns, then describe a hierarchically recursive algorithm that
searches for incrementally more complex patterns. The following
sections compare my scheme to ANNs, BNNs, and CapsNets. This is an open
project, CogAlg <https://github.com/boris-kz/CogAlg/wiki>; we need help
with design and implementation in Python. I offer awards for
contributions, or monthly payment given a track record; see the last
part.




------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T8366cc740ec68376-M9309d27b3ad5dd3fba150706