On Sun, Apr 16, 2023 at 7:51 AM Thomas Passin <[email protected]> wrote:

> Reading more of the material, and the published paper on solving the
> Raven's Progressive Matrices, I'm not convinced that the RPM situation is
> as impressive as it seems.


Hi Thomas. Thanks for your thoughtful comments.

I've mostly given up trying to understand the paper :-) In particular, I
have no idea how the NN creates the "universe" of possible answers.  Still,
the *mathematical* trick of using a big space is something to remember.
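That "big space" trick is easy to demonstrate for yourself: random dense binary vectors in a high-dimensional space are almost surely quasi-orthogonal. Here's a quick numpy sketch (my own, not from the paper) using normalized Hamming similarity, where 0 means orthogonal:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10_000  # high dimension: random vectors are almost surely quasi-orthogonal

def rand_vec():
    # dense binary vector: each component is 0 or 1 with equal probability
    return rng.integers(0, 2, d, dtype=np.uint8)

def sim(a, b):
    # similarity in [-1, 1] based on Hamming distance; ~0 means quasi-orthogonal
    return 1 - 2 * np.mean(a != b)

a, b = rand_vec(), rand_vec()
print(sim(a, a))            # 1.0 (identical)
print(round(sim(a, b), 2))  # near 0 (quasi-orthogonal)
```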

I like to think in pictures, so I paid most attention to the figures at the
end.  Here is a quote (emphasis mine):

QQQ
At the lowest level of the hierarchy, the four attribute values are
represented by *randomly* drawing four d-dimensional vectors (x[red], ...).
The vectors are *dense binary*, and arranged as d = 10 x 10 for the sake of
visual illustration.

At the next level, the red square object is described as a fixed-width
product vector by binding two corresponding vectors (x[red] dot
x[square]) *whose similarity is nearly zero to all attribute vectors and
other possible product vectors* such as (x[blue] dot x[triangle]),
(x[red] dot x[triangle]), etc. as shown in (c).

This quasi-orthogonality allows the VSA representations to be co-activated
with minimal interference.

At the highest level, the two object vectors are bundled together by
similarity-preserving bundling to describe the scene. *The bundled vector
is similar solely to those objects' vectors and dissimilar to others.*

QQQ

This trick/tool is what I take from the paper. Everything else is
mysterious :-)
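For anyone who wants to play with the trick, the quoted three-level construction can be sketched with standard operations for dense binary VSAs: XOR for binding and a majority vote for bundling. These are my guesses at the operations, not necessarily what the paper actually uses:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 10_000

def rand_vec():
    # random dense binary vector (lowest level: attribute values)
    return rng.integers(0, 2, d, dtype=np.uint8)

def bind(a, b):
    # XOR binding: the product vector is quasi-orthogonal to its factors
    return a ^ b

def bundle(vecs):
    # majority-vote bundling: the result stays similar to each member
    s = np.sum(vecs, axis=0)
    out = (s * 2 > len(vecs)).astype(np.uint8)
    ties = s * 2 == len(vecs)
    out[ties] = rng.integers(0, 2, ties.sum())  # break ties randomly
    return out

def sim(a, b):
    # normalized Hamming similarity: 1 = identical, ~0 = quasi-orthogonal
    return 1 - 2 * np.mean(a != b)

red, blue, square, triangle = (rand_vec() for _ in range(4))
red_square = bind(red, square)
blue_triangle = bind(blue, triangle)

# the bound product is quasi-orthogonal to its attributes and other products
print(round(sim(red_square, red), 2))                  # ~0
print(round(sim(red_square, bind(red, triangle)), 2))  # ~0

# the bundled scene is similar to its members, dissimilar to other objects
scene = bundle([red_square, blue_triangle])
print(round(sim(scene, red_square), 2))           # ~0.5
print(round(sim(scene, bind(red, triangle)), 2))  # ~0
```

With d = 10,000 the "nearly zero" similarities land within a percent or two of zero, which is the quasi-orthogonality the paper relies on for co-activation with minimal interference.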

Edward

-- 
You received this message because you are subscribed to the Google Groups 
"leo-editor" group.