On Mon, Feb 25, 2019 at 4:06 PM Rob Freeman <chaotic.langu...@gmail.com>
wrote:

>
>
> Permutation,
>

You would have to give me some concrete example to take away. Permutations are
also extremely common. Knowledge of permutation groups and Young
tableaux is a fundamental prerequisite for studying Lie algebras.
Differential geometry and Grassmann algebra are all about sign changes under
permutation. The shuffle product of tensor algebras is a certain kind of
order-preserving permutation.
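
To make that last claim concrete, here is a toy Python sketch of the shuffle
product on two words (my own illustration, not taken from any of the
references): it enumerates every interleaving that preserves the relative
order of the letters within each word.

    # Toy sketch: the shuffle product of two words, i.e. all interleavings
    # that keep the letters of each word in their original relative order.
    def shuffle(a, b):
        if not a:
            return [b]
        if not b:
            return [a]
        # The first letter of the result comes from either a or b;
        # recurse on the remainder in both cases.
        return [a[0] + s for s in shuffle(a[1:], b)] + \
               [b[0] + s for s in shuffle(a, b[1:])]

    print(shuffle("ab", "xy"))
    # ['abxy', 'axby', 'axyb', 'xaby', 'xayb', 'xyab']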

The Pissanetzky permutations *are exactly the same thing* as the
permutations in the trace monoid op. cit.  That's why he's talking about
permutations. Everyone does, for that particular kind of system.

Maybe I can't see the tree for the forest. Show me a tree.


>
> LV, Feb 9: "If you are picky, you can have several different kinds of
> common nouns, but however picky you are, you need fewer classes than there
> are words."
>
> You don't address this at all in your reply,
>

I thought I did. Noah Webster. Published a dictionary. Webster's
dictionary. Pointed out there are nouns, and there are verbs. That's two
classes. People have been fine with this ever since.



>   This is a crucial point.
>

? You want more than just "nouns" and "verbs"?  Yes there are more. So what?

>
>
> Do you suspect your use of "gauge" is an abuse?
>

Do I need to point out that I have a PhD in theoretical particle physics,
and know *exactly* what gauge symmetry is, in tremendous detail? I've never
heard of the word "gauge" used in linguistics before, and if there is such
a usage, I cannot imagine that it has anything at all to do with gauge
symmetry in physics. Now, neural nets *do* have a certain kind of gauge
symmetry, but I doubt that most neural-net people know or care; those who
have spotted it almost surely have a doctorate in physics (gauge-theory is
not an undergrad topic). And they probably said "huh, that's interesting",
shrugged and moved on.


>
> Do you assert that such "ambiguity of factorization" is insignificant in
> natural language (even only "mostly", which is still non-zero and might be
> significant)? On what evidence?
>

I'm asserting that there is a matrix M that can be factorized into a
product M = AB.  The two factors are A and B, both matrices.  Into that
product, one is free to insert the identity matrix written as I = R^{-1}R,
where R is a matrix and R^{-1} is its inverse, so that R^{-1}R is the
identity.  One then has:

      M = AB = A I B = AR^{-1}RB = (AR^{-1}) (RB)

so that if A and B are factors, then so are (AR^{-1}) and (RB).  Both are
equally good as factors. Thus, the factorization is completely ambiguous:
there are an infinite number of R you can stick in there. This is called a
"global gauge symmetry".  What you do next is up to you. Some people like
to "fix the gauge" by demanding that the two factors obey some additional
properties, depending on what they want to do. Maybe they want A to be
upper-triangular ("solvable"). Maybe they want it to be symmetric or
anti-symmetric or diagonal or real or hermitian or orthogonal or maybe
"numerically stable" or whatever. Any of these additional requirements may
or may not "fix the gauge", partially or completely.
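
For anyone who wants to see the ambiguity numerically, here's a minimal
numpy sketch (the matrices are arbitrary random examples; QR is used just
as one conventional way of fixing the gauge):

    import numpy as np

    rng = np.random.default_rng(0)

    # Any factorization M = A B ...
    A = rng.standard_normal((4, 3))
    B = rng.standard_normal((3, 5))
    M = A @ B

    # ... can be rewritten with an arbitrary invertible R in the middle.
    R = rng.standard_normal((3, 3))      # random square matrix: almost surely invertible
    A2 = A @ np.linalg.inv(R)            # A R^{-1}
    B2 = R @ B                           # R B

    # (A R^{-1})(R B) reproduces M to within floating-point round-off.
    print(np.allclose(M, A2 @ B2))       # True

    # One way to "fix the gauge": demand the left factor be orthogonal,
    # which is exactly what a QR decomposition does.
    Q, S = np.linalg.qr(M)
    print(np.allclose(M, Q @ S))         # True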

This can also be seen in neural nets. Given a collection of vectors v times
a weight matrix W, you can multiply v by a rotation matrix, as long as
you *also* multiply the weight matrix by the inverse rotation.  That is,
for the product v.W you have v.W = (vR^{-1}) . (RW), so that vR^{-1} is
also a vector, and RW is also a weight matrix, and they both describe
exactly the same neural net, in the same way.
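
The same statement as a short numpy check (again just a toy sketch, with
random vectors and weights):

    import numpy as np

    rng = np.random.default_rng(1)
    v = rng.standard_normal((10, 8))     # a batch of input vectors
    W = rng.standard_normal((8, 4))      # a weight matrix

    # A random orthogonal matrix, so that R^{-1} = R^T.
    R, _ = np.linalg.qr(rng.standard_normal((8, 8)))

    v2 = v @ R.T                         # v R^{-1}
    W2 = R @ W                           # R W

    # Both descriptions compute exactly the same outputs.
    print(np.allclose(v @ W, v2 @ W2))   # True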



>  Certainly everybody was stumbling at this step in the '90s, by the likes
> of David Powers and Hinrich Schuetze trying to learn
>

To the best of my knowledge, exactly zero people have ever done this
before, or anything remotely like it. Not even a little bit close. So
excuse me.



>
> You "don't know what to do with this" so you ignore it and move on?
>

Yep. Life is short. I'm sorry, the conversation is kind of ridiculous. If
there is something specific and concrete you want to say, or want me to
know, out with it. If you're merely insinuating something, I don't know what
to do with that.  I'm certainly not going to read Chomsky from 60 years ago. There's a
lot of stuff he said back then that was absurd and wrong. People didn't
know any better back then; the entire discipline is now 60 years older and
wiser and more knowledgeable. Maybe a historian of science can do something
with this; I don't see how I'm going to learn one tiny iota by visiting
that old ground.

All this is a distraction. I really need to be doing something else.

--linas

>
>
> -Rob
>
> On Mon, Feb 25, 2019 at 6:30 PM Linas Vepstas <linasveps...@gmail.com>
> wrote:
>
>>
>>
>> On Sun, Feb 24, 2019 at 6:34 PM Rob Freeman <chaotic.langu...@gmail.com>
>> wrote:
>>
>>>
>>> Where I see the gap might be summarized with one statement by Linas:
>>>
>>> LV, Feb 9: "If you are picky, you can have several different kinds of
>>> common nouns, but however picky you are, you need fewer classes than there
>>> are words."
>>>
>>> "...however picky you are, you need fewer classes than there are
>>> words..."
>>>
>>> "Fewer classes"?
>>>
>>> How do we know that is true?
>>>
>>
>> Quoting out of context is dangerous, and can lead to misunderstanding. We
>> know it's true because Noah Webster pointed it out in 1806 and pretty much
>> everyone has agreed with him, ever since.
>>
>>
>>> Experimentally, that language "gauge" is not pointless, was observed in
>>> the results for the distributional learning of phonemes I cited, dating
>>> back 60 years, to the time Chomsky used it to destroy distributional
>>> learning as the central paradigm for linguistics.
>>>
>>
>> I doubt that the use of the word "gauge" there has anything at all to do
>> with the word "gauge" of "gauge theory".   In physics, "gauge" has a very
>> specific and precise meaning; I suspect that there's an abuse of that word,
>> here.
>>
>>
>> > Sergio Pissanetzky comes close to this, with his analysis in terms of
>> permutations, also resulting in a network:
>>
>> > "Structural Emergence in Partially Ordered Sets is the Key to
>> Intelligence"
>> http://sergio.pissanetzky.com/Publications/AGI2011.pdf
>>
>> I'll look. Is it actually that good?
>>
>>
>>>
>>>   Currently they are stumbling at the "clustering" step'
>>>
>>> Sure they will!
>>>
>>
>> It's more mundane than that. They haven't bothered with some basic steps
>> of data analysis. Getting them to actually filter out the junk from the
>> valid data is the proverbial "like pulling teeth": they don't want to do it,
>> and insist that it'll be fine, but then complain about the results.  It's
>> ordinary lab science, repeated around the world dozens of times a day:
>> experiments don't work because of contaminants and interference.
>>
>>
>>>
>>>
>>> Looking up what you are doing with your "MST" parser. You start by
>>> clustering links, starting with mutual information.
>>>
>>
>> It's just one lens in a microscope assembly. It does a certain task, it
>> does it OK-ish; what matters is its role in the entire assembly, and not
>> as a stand-alone component. Too many people are too obsessed with it as a
>> stand-alone component.
>>
>>
>>>
>>>
>>> OK, you may be right that BERT models may have an advantage that Linas
>>> doesn't see, because deep nets do allow some recombination of parts, within
>>> a network layer, at run time.
>>>
>>
>> I also haven't studied BERT. What's more important, BERT or Pissanetzky?
>>
>> --linas
>>
>>>
>>>
>>
>> --
>> cassette tapes - analog TV - film cameras - you
>>
>


-- 
cassette tapes - analog TV - film cameras - you
