Yeah, I know symbolic dynamics pretty well. I think I wrote most of the
Wikipedia article on "subshifts of finite type" and the rainbow of related
topics - the product topology, the cylinder sets, e.g. most of
"measure-preserving dynamical system". There's a vast network of related
topics, and
Hi Enzo,
On Mon, Jun 19, 2017 at 3:49 PM, Enzo Fenoglio (efenogli) <
efeno...@cisco.com> wrote:
>
>
> A “sigmoid-thresholded eigenvector classifier” is just a single layer
> autoencoder with sigmoid activation. That’s equivalent to performing PCA as
> you did. But if you had used a stacked
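For concreteness, here is a minimal numpy sketch of the "sigmoid-thresholded eigenvector classifier" under discussion: project the data onto the top principal eigenvector (straight from PCA) and squash with a sigmoid. The data, seed, and 0.5 threshold are all made up for illustration; the only point is that the eigenvector step is exactly PCA.

```python
import numpy as np

# Toy data standing in for whatever feature vectors are being classified.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
X -= X.mean(axis=0)                      # center, as PCA requires

# Top principal eigenvector of the covariance, via SVD of the data matrix.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
v = Vt[0]                                # leading principal direction

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

scores = sigmoid(X @ v)                  # squashed projections, all in (0, 1)
labels = (scores > 0.5).astype(int)      # threshold into a binary "classifier"
```

A single linear layer trained to reconstruct X (a one-layer autoencoder, before the sigmoid) converges to this same principal subspace, which is why the two descriptions coincide.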
On Mon, Jun 19, 2017 at 3:26 PM, Hugo Latapie (hlatapie) wrote:
> Thanks Linus. The approach here does look extremely promising.
>
>
>
> Bridging the gap between these various camps is the holy grail that few
> are even searching for, much less attempting to implement.
>
On Mon, Jun 19, 2017 at 2:11 PM, Hugo Latapie (hlatapie) wrote:
> *arXiv:1702.00764*
I've just barely started reading that, and from the very beginning, it's
eminently clear how even the latest, leading research on deep neural nets
is profoundly ignorant of grammar and
Again, there's a misunderstanding here. Yes, PCA is not composable; sheaves
are. I'm using sheaves. The reason that I looked at PCA was to use a
thresholded, sparse PCA for CLUSTERING, and NOT for similarity, where
compositionality does not matter. It's really a completely different
concept, quite
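A rough sketch of what "thresholded, sparse PCA for clustering" could look like. The planted block data, number of components, and threshold are all invented for illustration, not taken from the actual pipeline: zero out small PCA loadings, then assign each item to the component it loads on most strongly.

```python
import numpy as np

rng = np.random.default_rng(1)
# Two planted blocks of correlated features stand in for word-context counts.
A = rng.normal(size=(50, 1)) @ np.ones((1, 4)) + 0.1 * rng.normal(size=(50, 4))
B = rng.normal(size=(50, 1)) @ np.ones((1, 4)) + 0.1 * rng.normal(size=(50, 4))
X = np.block([[A, np.zeros_like(B)], [np.zeros_like(A), B]])
X -= X.mean(axis=0)

k, tau = 2, 0.1                          # components kept, loading threshold
_, _, Vt = np.linalg.svd(X, full_matrices=False)
V = Vt[:k].T                             # loadings, shape (features, k)
V[np.abs(V) < tau] = 0.0                 # sparsify: drop weak loadings

proj = X @ V                             # project items onto the sparse axes
clusters = np.argmax(np.abs(proj), axis=1)   # hard cluster assignment
```

Here the thresholding is what turns a similarity-style projection into a hard cluster label: each item is claimed by exactly one sparse component.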
OK, well, some quick comments:
-- sparsity is a good thing, not a bad thing. It's one of the big
indicators that we're on the right track: instead of seeing that everything
is like everything else, we're seeing that only one out of every 2^15 or
2^16 possibilities is actually being observed! So
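The quoted ratio is easy to pin down numerically (the 2^15 and 2^16 figures are the thread's; nothing new is added here):

```python
# Fraction of the possibility space actually observed, per the quoted range.
observed_fraction_hi = 1 / 2**15         # one in 32768, ~0.003%
observed_fraction_lo = 1 / 2**16         # one in 65536, ~0.0015%
```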
On Tue, Jun 20, 2017 at 12:07 AM, Linas Vepstas wrote:
> So again, this is not where the action is. What we need is accurate,
> high-performance, non-ad-hoc clustering. I guess I'm ready to accept
> agglomerative clustering, if there's nothing else that's simpler,
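For reference, agglomerative clustering in its simplest form. Single linkage and Euclidean distance are assumptions here, since the message leaves the metric and linkage open; the toy points are invented.

```python
import numpy as np

def agglomerate(points, n_clusters):
    """Repeatedly merge the two closest clusters until n_clusters remain."""
    clusters = [[i] for i in range(len(points))]

    def dist(ci, cj):
        # single linkage: distance between the closest pair of members
        return min(np.linalg.norm(points[a] - points[b])
                   for a in ci for b in cj)

    while len(clusters) > n_clusters:
        # find the closest pair of clusters and merge them
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda p: dist(clusters[p[0]], clusters[p[1]]))
        clusters[i] += clusters.pop(j)
    return clusters

pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
result = agglomerate(pts, 2)             # two tight pairs merge first
```

This brute-force version is O(n^3) and only meant to show the shape of the method; a real implementation would cache the distance matrix.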
Hi Ben,
On Mon, Jun 19, 2017 at 9:01 AM, Ben Goertzel wrote:
> Hi Linas,
>
> I have read the report now...
>
> Looking at the cosine similarity results, it seems clear the corpus
> you're using is way too small for the purpose (there's no good reason
> "He" and "There" should
>> From: Linas Vepstas [mailto:linasveps...@gmail.com]
>> Sent: Monday, 19 June 2017 11:16
>> To: opencog <opencog@googlegroups.com>; Curtis M. Faith
>> <curtis.m.fa...@gmail.com>
>> Cc: Ruiting Lian <ruit...@han
On Mon, Jun 19, 2017 at 3:31 AM, Ben Goertzel wrote:
>
> Regarding "hidden multivariate logistic regression", as you hint at
> the end of your document ... it seems you are gradually inching toward
> my suggestion of using neural nets here...
>
Maybe. I want to understand the
Interesting! I will read it through tomorrow, the rest of my today
seems eaten by other stuff...
I am not surprised that PCA stinks as a classifier...
Regarding "hidden multivariate logistic regression", as you hint at
the end of your document ... it seems you are gradually inching toward
my