Ed,
Get a grip. Try to write with complete words in complete sentences
(unless "discreted" means a combination of "excreted" and "discredited" --
which works for me :-).
I'm not coming back for a second swing. I'm still pursuing the first
one. You just aren't oriented well enough to realize it.
Now you are implicitly attacking me for supposedly claiming it's new
to think you could deal with vectors in some sort of compressed
representation.
Nope. First of all, "compressed representation" is *absolutely* the wrong
term for what you're looking for.
Second, I'm actually still trying to figure out what *you* think you
ARE gushing about. (And my quest is not helped by such gems as "all though
[sic] it may not be new to you, it seems to be new to some".)
Why don't you just answer my question? Do you believe that this is some
sort of huge conceptual breakthrough? For NLP (as you were initially
pushing) or just for some nice computational tricks?
I'll also note that you've shifted the focus of this well away from
the NLP work that you were initially raving about as such quality work --
and while I'll agree that kernel mapping is a very elegant tool, Collins's
work is emphatically *not* what I would call a shining example of it (I
mean, *look* at his results -- they're terrible). Yet you were touting it
because of your 500,000-dimension fantasies and your belief that it's good
NLP work.
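For the record, here's the distinction I'm driving at: a kernel map
doesn't *compress* the high-dimensional vectors, it avoids materializing
them at all. A minimal sketch in Python -- toy numbers and a degree-2
polynomial kernel, nothing here taken from Collins's papers:

    import numpy as np

    def poly_kernel(x, y):
        # Degree-2 polynomial kernel: the inner product in the implicit
        # feature space of all monomials up to degree 2, computed without
        # ever building that space.
        return (np.dot(x, y) + 1.0) ** 2

    def explicit_map(x):
        # The feature map the kernel implies, built explicitly here only
        # to show the equivalence; in practice it is never formed.
        n = len(x)
        feats = [1.0]
        feats += [np.sqrt(2.0) * xi for xi in x]
        feats += [xi * xi for xi in x]
        feats += [np.sqrt(2.0) * x[i] * x[j]
                  for i in range(n) for j in range(i + 1, n)]
        return np.array(feats)

    x = np.array([1.0, 2.0, 3.0])
    y = np.array([0.5, -1.0, 2.0])

    print(poly_kernel(x, y))                         # 30.25
    print(np.dot(explicit_map(x), explicit_map(y)))  # 30.25, same value

Both routes give the same inner product, and the "extra" dimensions never
exist anywhere in memory -- which is exactly why "compressed
representation" is the wrong term.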
So, in small words -- and not whining about an attack -- what precisely
are you saying?