Eric Baum wrote:

The argument, in very brief, is the following. Evolution found a
very compact program that does the right thing. (This is my
hypothesis, not claimed proved but lots of reasons to believe it
given in WIT?.) Finding such programs is NP-hard.

Richard> Hold it right there.  As far as I can see, you just asserted
Richard> the result that is under dispute, right there at the
Richard> beginning of your argument!

First, above I was discussing finding an understanding system,
not necessarily a language understanding system-- say a monkey,
not a person. Then I went on to talk about additional problems
coming when you want language.

Richard> Finding a language-understanding mechanism is NP-hard?

Richard> That prompts two questions:

Richard> 1) Making statements about NP-hardness requires a problem to
Richard> be formalized in such a way as to do the math.  But in order
Richard> to do that formalization you have to make assumptions, and
Richard> the only assumptions I have ever seen reported in this
Richard> context are close relatives of the ones that are under
Richard> dispute (that grammar induction is context free,
Richard> essentially), and if you have made those assumptions, you
Richard> have assumed what you were trying to demonstrate!

At this point, I suggest again you read What is Thought?
In these emails I am cutting a lot of corners and leaving out caveats
and whatnot, in order to give three-paragraph summaries.

But if you don't want to take the time, I'll cut to the chase.
I am not claiming to have proved that building a human is NP-hard.

Eric,

I am having serious difficulty here: I made a very specific point, originally, and you made a reply to that - but your replies are wandering off into other topics. For example: I neither thought nor implied that your claim was "to have proved that building a human is NP-hard," so I am puzzled why you should say this.


I am suggesting a theory of thought, for which I think there is a lot of evidence and basis. Computational learning theory has more
or less established that generalization (never mind thought) follows
from finding a constrained-enough hypothesis. And in every case that has
been studied of sufficient richness to be really interesting, it turns
out that the problems you have to solve are NP-hard. So naturally, in
my extrapolation to thought, I expect that the problems you will have
to solve here are NP-hard as well. This isn't exactly rigorous,
but it's the way to bet. Your protests are mostly wishful thinking.
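To make the flavor of this concrete (a toy sketch of my own, not an example from WIT?): even for the simplest hypothesis classes, the naive way to find a consistent hypothesis is to enumerate an exponentially large space, and for richer classes -- smallest consistent DNF, smallest decision tree -- the search is provably NP-hard.

```python
from itertools import combinations

def smallest_consistent_conjunction(examples, n_features):
    """Brute-force search for the smallest monotone conjunction
    consistent with the labeled examples.

    The space holds 2**n_features candidate conjunctions, so this
    enumeration is exponential in n_features. (Conjunctions happen
    to admit a polynomial-time shortcut; for richer classes such as
    smallest DNF or smallest decision tree, no shortcut exists
    unless P = NP.)
    """
    for size in range(n_features + 1):
        for subset in combinations(range(n_features), size):
            if all(all(x[i] for i in subset) == label
                   for x, label in examples):
                return set(subset)
    return None

# Toy data: the label is True exactly when features 0 and 2 are on.
data = [((1, 0, 1), True), ((1, 1, 0), False),
        ((0, 1, 1), False), ((1, 1, 1), True)]
print(smallest_consistent_conjunction(data, 3))  # -> {0, 2}
```

The compact hypothesis {0, 2} generalizes; the cost is the search for it.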

It's also true that proving something is NP-hard doesn't prove it's
insoluble, or even hard in the average case, or hard in any particular
case, or any of that. Hell, it might even be true that P=NP.
But there are a lot of strong reasons to think that all of this is
also wishful thinking, and that NP-hard problems are really hard.
I'll say it again: if you don't believe that, you shouldn't be using
cryptography, because cryptography as practiced relies not just on
P!=NP but on much stronger assumptions: that it is very hard to
factor *random* products of primes, for example, and factoring isn't even NP-hard.
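To illustrate that asymmetry with a toy of my own (again, not an example from WIT?): multiplying two primes is trivial, while the obvious way to undo it, trial division, takes on the order of sqrt(n) steps, which is exponential in the number of digits of n.

```python
def trial_division(n):
    """Factor n by trying divisors up to sqrt(n).

    Multiplying two k-digit primes takes polynomial time, but this
    naive inverse takes on the order of 10**k steps -- exponential
    in k. (Better factoring algorithms exist, but every known one is
    still super-polynomial, which is what cryptography banks on.)
    """
    d, factors = 2, []
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

p, q = 1009, 1013             # multiplying is the easy direction
print(trial_division(p * q))  # recovering p and q takes ~1000 divisions
```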

It's clear from what you are writing here that you are not familiar
with computational learning theory or computational complexity
theory. I strongly suggest you read WIT?. I think you will learn a lot.

[Please don't be tempted to make comments about my general level of expertise in this or that field. You are mistaken in this, but I am not going to argue about it].

My understanding of these areas is reasonably deep, though not complete, and I have made specific points that (if you will read Pei Wang's comment) are being reinforced by others more expert than myself.

As for COLT, you have said something that I completely disagree with: "Computational learning theory has more or less established that generalization (never mind thought) follows from finding a constrained-enough hypothesis." That would be true if you accepted the narrow characterization of "generalization" that is used in COLT, but it is most emphatically not true if (as I do) you consider this narrow reading to be only a trivial version of the mechanism to be found in the human cognitive system.

This is an absolutely crucial point. The COLT people have decided to use the word "generalization" to describe a formal process that THEY define, and the reason they define it their way is that their narrow definition makes it amenable to mathematical proofs (e.g. proofs that THEIR version is an NP-hard problem). Frankly, I don't care if they can prove that their version is NP-hard, because nothing follows from it.
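To be concrete about what "their version" amounts to: in the PAC formalism, "generalization" is literally a sample-complexity statement about a fixed, formally defined hypothesis class. A sketch of the standard finite-class bound (the formula is the textbook one; the class and parameter values are my own illustrative choices):

```python
from math import ceil, log

def pac_sample_bound(h_size, epsilon, delta):
    """Occam/PAC bound for a finite hypothesis class H: any hypothesis
    consistent with this many i.i.d. examples has true error at most
    epsilon, with probability at least 1 - delta."""
    return ceil((log(h_size) + log(1 / delta)) / epsilon)

# Conjunctions over 10 boolean features, each literal positive,
# negated, or absent: |H| = 3**10 hypotheses.
print(pac_sample_bound(3 ** 10, epsilon=0.1, delta=0.05))  # -> 140
```

My point stands independently of the arithmetic: every quantity in that calculation is pinned to the formal definitions of H, epsilon and delta, and nothing in it speaks to the looser, cluster-of-processes sense of "generalization".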

Other people do not take that narrow view, but instead consider "generalization" to be a much more complicated process (probably a cluster of several processes) defined on a system that is not nearly as simple in structure as the systems that COLT people use, and for all these reasons there is no way to get a handle on this larger version of generalization sufficient to decide whether it is NP-hard or not.

Do you discuss this issue in your book? Have you explicitly talked about the fact that some people dispute the validity of the COLT interpretation of generalization? If you did cover all this in your book, and gave a good analysis of the pros and cons of accepting the COLT interpretation, I would be impressed indeed, and eager to find out what you have to say.

But as it is, I am fairly sure that this is news to you: that you buy the COLT interpretation lock, stock and barrel, and are not even aware that everything you say about the NP-hardness of learning (and, by extension, thought) hinges on whether or not you buy into their interpretation.

[A rhetorical side-note: Just for the record, I am not attacking your general knowledge of an entire field: I am saying that you appear not to know about this specific issue.]

And if you are not aware of the fact that the COLT interpretation is contested, you have been arguing from a castle built on sand - and in light of that, it is a little difficult to hear "Your protests are mostly wishful thinking." ;-)



Chapter 4 is a review of the results in computational learning
theory that is meant to be accessible. Chapter 11 is a review of
complexity theory, also meant to be accessible. The book is, I think,
written from a point of view that you will find congenial -- I am very
much focussed on semantics, I see the world a lot like you do,
except I am bringing the conclusions of computational learning theory,
complexity theory, and linguistics to bear. I am not asserting these
conclusions -- I question each one and discuss data pro and con.
Where they seem very well founded and relevant, I discuss their
implications. I am also not arguing rigorously. I am extracting
from these disciplines the intuition behind rigorous results,
and extending it to give what I argue is a compelling view of what cognition is.

Now you see, the funny thing is that I have had the same thoughts myself: in some ways you and I *do* agree about certain aspects of the problem. The only thing is that I don't buy some of the specific, very strong points that you are making with respect to the computational complexity of processes like grammar induction and the evolutionary construction of learning systems.

We are coming from similar points of view, but reaching diametrically opposed conclusions.




Richard Loosemore.

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303
