Reading this I can't help but think of homomorphic encryption, which makes it possible to perform computation on encrypted data. If you could use a compression algorithm as your encryption method, you would essentially be done. Unfortunately, homomorphic encryption schemes operate on individual bits (or similarly small plaintext units), while compression algorithms certainly take far more than a single bit into account when they compress. So there would be quite a bit of work to do to generalize that mathematics to this class of encryption methods, but that is where I would focus my efforts if I wanted to provide this capability.
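
To make "computation on encrypted data" concrete, here is a toy sketch of my own (textbook RSA with tiny hand-picked numbers, nothing anyone would deploy): multiplying two ciphertexts yields an encryption of the product of the plaintexts.

    # Toy illustration of a homomorphic property: textbook RSA is
    # multiplicatively homomorphic, i.e. enc(a) * enc(b) mod n decrypts to a * b.
    # Tiny hand-picked primes, no padding -- purely for illustration.
    p, q = 61, 53
    n = p * q                   # modulus, 3233
    phi = (p - 1) * (q - 1)     # 3120
    e = 17                      # public exponent, coprime with phi
    d = pow(e, -1, phi)         # private exponent (Python 3.8+ modular inverse)

    def enc(m):
        return pow(m, e, n)

    def dec(c):
        return pow(c, d, n)

    a, b = 7, 12
    c = (enc(a) * enc(b)) % n   # arithmetic on ciphertexts only
    print(dec(c), a * b)        # both print 84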

But you seem to have dismissed homomorphic encryption for some reason when
you said the following:

"I did not want to use binary arithmetic as an example because computers
were designed around those principles."


Homomorphic encryption most certainly uses binary arithmetic.  Can you
elaborate on why you wish to preclude this?
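
For what I mean by bit-level arithmetic, the simplest homomorphic example I can offer is the XOR one-time pad (a toy sketch of my own, not any real FHE scheme): XOR-ing two ciphertexts decrypts to the XOR of the plaintext bits, i.e. addition mod 2. Schemes in the Gentry line evaluate Boolean circuits over encrypted bits in much that spirit, with AND as well, which this toy lacks.

    import secrets

    # Toy bitwise-homomorphic sketch: the XOR one-time pad.
    # enc(b, k) = b XOR k, and XOR-ing two ciphertexts gives a ciphertext
    # of a XOR b (addition mod 2) under the combined key ka XOR kb.
    def keygen():
        return secrets.randbits(1)

    def enc(bit, key):
        return bit ^ key

    def dec(ct, key):
        return ct ^ key

    a, b = 1, 1
    ka, kb = keygen(), keygen()
    ca, cb = enc(a, ka), enc(b, kb)

    c_sum = ca ^ cb               # computed on ciphertexts only
    print(dec(c_sum, ka ^ kb))    # 0, i.e. (a + b) mod 2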


Memories in the brain are reconstructive and confabulatory. That is to say, when you ask someone to recall something, they will not recall the information as it was, but rather as their brain now is. And one can alter another person's brain to effectively perform CRUD operations on their memories, changing those memories to whatever extent you wish. That seems to be a rather big problem for memory systems that are brain-inspired. Computers have perfect recall, which is highly desirable and a great improvement over humans. I'm not certain why you would wish to replace such a perfect system with a lossy (and confabulatory) one.
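
To make the contrast concrete, here is a toy sketch of my own (a dictionary lookup next to a tiny Hopfield-style associative memory; not a claim about any particular brain-inspired architecture). The first always returns exactly what was stored; the second reconstructs a pattern from a cue, and what it reconstructs depends on everything else stored in the same weights.

    import numpy as np

    # Perfect recall: store and retrieve exactly.
    exact_memory = {"event": (1, -1, 1, -1, 1, -1, 1, -1)}
    print(exact_memory["event"])     # always exactly what was stored

    # Reconstructive recall: patterns are superimposed in one weight matrix
    # (Hebbian outer products), and retrieval re-creates a pattern from a cue.
    patterns = np.array([
        [1, -1, 1, -1, 1, -1, 1, -1],
        [1, 1, -1, -1, 1, 1, -1, -1],
    ])
    W = sum(np.outer(p, p) for p in patterns)
    np.fill_diagonal(W, 0)

    cue = np.array([1, -1, 1, -1, 1, -1, -1, -1])   # a partly wrong cue
    state = cue.copy()
    for _ in range(5):                               # settle to a fixed point
        state = np.sign(W @ state)
    print(state)   # the first stored pattern, reconstructed from the cue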


On Sat, Jun 10, 2017 at 4:02 AM, Jim Bromer <[email protected]> wrote:

> Rob,
> I will look at the paper when I get a chance.
>
> Jim Bromer
>
> On Wed, Jun 7, 2017 at 7:16 PM, Rob Freeman <[email protected]>
> wrote:
>
>> Jim,
>>
>> Have a look at this paper and see if you find it relevant. I understand
>> it to be a sketch for logic using distributed representation. RNN's still
>> globally optimize, so I think they will still have lossy compression
>> (instead of partial compression?) But the idea of using distributed
>> representation is on the right track:
>>
>> Semantic Compositionality through Recursive Matrix-Vector Spaces
>> Richard Socher   Brody Huval   Christopher D. Manning    Andrew Y. Ng
>> https://nlp.stanford.edu/pubs/SocherHuvalManningNg_EMNLP2012.pdf
>>
>> -Rob
>>
>


