My profuse thanks to those who offered constructive comments on the 
maximum entropy principle. My brief responses follow.

Kathy Laskey (7-19-03)

Dear Kathy:

Please accept my apology for ascribing to you a view on the adequacy of 
probability theory which, in fact, you had not expressed. But what 
surprises me is your unqualified acceptance of decision-theoretic 
arguments. In reality, there are not many fields which are as unsettled 
as decision analysis. This is a widely-held view among those who, like 
me, have a lifetime of experience in dealing with decision problems.

Despite the enormous literature, there are no decision principles which 
are uncontested. The only case that has an obvious answer is the 
following. There are two options: A and B. If you choose A you get a,
and if you choose B you get b, with a greater than b. Clearly, you
would choose A. But let us change the problem slightly. If you choose A 
you get a, and if you choose B you get b or c, with a lying between b 
and c. Would you choose A or B? No decision principle can be used to 
answer this question. If we alter the problem by associating b and c 
with respective probabilities p and 1-p, the question remains
unanswerable. We can use, of course, the principle of maximization of 
expected utility, but it is well known that the principle leads to 
counterintuitive conclusions (Allais' paradox).
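The Allais pattern can be made concrete with a small sketch. This is my own illustration, not part of the original correspondence; the payoffs and probabilities are the classic Allais lotteries, used here for concreteness:

```python
# Illustrative sketch (not from the letter): the classic Allais lotteries.
# The common preference pattern -- A over B, but D over C -- cannot be
# reproduced by maximizing expected utility for ANY utility function u,
# since A preferred to B implies 0.11*u(1M) > 0.10*u(5M) + 0.01*u(0),
# which in turn implies C preferred to D.
def expected_value(lottery):
    """Expected monetary value of a lottery given as [(probability, payoff), ...]."""
    return sum(p * x for p, x in lottery)

A = [(1.00, 1_000_000)]                                # $1M for sure
B = [(0.10, 5_000_000), (0.89, 1_000_000), (0.01, 0)]  # gamble
C = [(0.11, 1_000_000), (0.89, 0)]
D = [(0.10, 5_000_000), (0.90, 0)]

# With u(x) = x, expected value ranks B above A and D above C,
# yet most people choose A and D.
for name, lottery in [("A", A), ("B", B), ("C", C), ("D", D)]:
    print(name, round(expected_value(lottery)))
```

Any concave (risk-averse) u can rationalize choosing A over B, but no u at all can rationalize choosing A over B while also choosing D over C, which is precisely the pattern observed in practice.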

In addition to these problems, there is a fundamental issue which 
precludes the possibility of constructing a definitive decision theory. 
Specifically, how can the possibility of an unexpected
event be taken into account in decision analysis? In all realistic settings,
this is an issue that has to be addressed. And yet, the nature of 
unexpectedness is such that it is impossible to concretize what 
unexpected events may occur and with what probabilities.

With regard to perceptions, my approach to perceptions is described in 
the paper "A New Direction in AI -- Toward a Computational Theory of
Perceptions," which appeared in the Spring 2001 issue of the AI
Magazine. A key idea in my approach is that of dealing not with 
perceptions per se, but with their descriptions in a natural language.

With regard to the relationship between probability theory and fuzzy 
logic, they are complementary rather than competitive. A more radical 
view, which I hold at this juncture, is that probability theory should be
based on fuzzy logic rather than on bivalent logic, as it is at present.
Note that fuzzy logic is a generalization of bivalent logic. As a 
consequence, fuzzy-logic-based probability theory is more general and
more attuned to the real, pervasively imprecise world than standard
bivalent-logic-based probability theory. I realize, of course, that it
may take some time for this to happen, but I have no doubt that
eventually probability theory will have fuzzy logic as its foundation.

With cordial regards.

Lotfi

Paul Snow (7-25-03)

Dear Paul:

Though I have very high regard for your analytical ability, I cannot 
agree with your contention that the issue of maximization of entropy 
subject to imprecise side-conditions is adequately treated in the Jaynes 
paper. The issue is much too complex to be treatable as an instance of 
round-off error. (See the paper of James Buckley.) Our disagreement can 
easily be resolved by your providing an answer to the following 
question. What is the entropy-maximizing probability distribution when 
what we know is that its mean is approximately a and its variance is 
approximately b, using your own definitions of "approximately a" and
"approximately b"? An example of "approximately a" is: Usually it
takes me approximately twenty minutes to drive to the campus.

With cordial regards.

Lotfi


James Buckley

Dear Jim:

Many thanks for your presentation of what you put forth as the solution 
of the maximum entropy problem. (Submitted to "Soft Computing,"
[EMAIL PROTECTED].) Unfortunately, your solution does not answer the
basic question: What is the entropy-maximizing distribution, P, when the 
side-conditions are imprecise? What you do is this: We know that when 
the mean and variance are specified the entropy-maximizing distribution
is Gaussian. Using the extension principle, or equivalently level sets, 
P becomes a Gaussian distribution with a fuzzy mean and a fuzzy 
variance. Thus, P is a fuzzy set of Gaussian distributions. But this is 
not what we seek. What we seek is a unique entropy-maximizing 
probability distribution. To find such a distribution, we have to form the
conjunction of goals and constraints, as described in my 1970 paper with
Bellman, "Decision-Making in a Fuzzy Environment."
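A minimal numerical sketch of the level-set construction described above, with made-up triangular fuzzy numbers for the mean and variance (this is my own illustration of the idea, not Buckley's code):

```python
# Sketch (illustrative assumptions throughout): with a triangular fuzzy
# mean "about m0" and fuzzy variance "about v0", each alpha-cut yields an
# interval of means and an interval of variances, hence a SET of Gaussian
# densities at that level -- a fuzzy set of Gaussians, not a single
# entropy-maximizing distribution.
import math

def alpha_cut(center, spread, alpha):
    """Alpha-cut [lo, hi] of a triangular fuzzy number (center, +/- spread)."""
    return (center - (1 - alpha) * spread, center + (1 - alpha) * spread)

def gaussian_pdf(x, mean, var):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# At alpha = 0.5, the fuzzy mean "about 0, +/- 2" cuts to [-1, 1] and the
# fuzzy variance "about 1, +/- 0.5" cuts to [0.75, 1.25]; the level-set
# image of P contains every N(m, v) with m and v in those intervals.
m_lo, m_hi = alpha_cut(0.0, 2.0, 0.5)
v_lo, v_hi = alpha_cut(1.0, 0.5, 0.5)
densities_at_0 = [gaussian_pdf(0.0, m, v)
                  for m in (m_lo, m_hi) for v in (v_lo, v_hi)]
print(min(densities_at_0), max(densities_at_0))  # a band of values, not one number
```

Evaluating the family at a single point already gives a band of density values rather than one number, which is the sense in which the construction does not deliver a unique entropy-maximizing distribution.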

Cordial regards.

Lotfi



Christopher Elsaesser (7-24-03)

Dear Christopher:

As I have stated in my earlier messages, human perceptions are 
intrinsically imprecise. For example, if I look at Mary, my estimate of 
her age expressed as "about 50" does not have sharp edges. Of course, I 
could estimate her age as an interval [a,b] with the understanding that 
her age is guaranteed to be within this interval. The problem with 
interval estimation is that the interval must be wide to guarantee that 
the estimate is correct.
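The contrast between the two kinds of estimate can be sketched as follows; the triangular membership function and its parameters are my own assumptions, chosen only for illustration:

```python
# Toy contrast (illustrative parameters): a guaranteed interval estimate
# has sharp edges -- membership jumps from 1 to 0 at the endpoints --
# while the perception "about 50" degrades gradually, modeled here as a
# triangular membership function.
def interval_estimate(a, b):
    """Membership is 1 inside [a, b] and 0 outside: sharp edges."""
    return lambda x: 1.0 if a <= x <= b else 0.0

def about(center, spread):
    """Triangular membership: full at the center, fading to 0 at +/- spread."""
    return lambda x: max(0.0, 1.0 - abs(x - center) / spread)

in_40_60 = interval_estimate(40, 60)
about_50 = about(50, 15)

for age in (50, 58, 61):
    print(age, in_40_60(age), round(about_50(age), 2))
# At 50 both give 1.0; at 58 the interval still says 1.0 while the fuzzy
# estimate has fallen to ~0.47; at 61 the interval drops abruptly to 0.0
# while the fuzzy estimate is still ~0.27.
```

The interval must be stretched to cover every conceivable case before its guarantee holds, whereas the graded estimate stays informative near the center without claiming certainty at the edges.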

With cordial regards.

Lotfi


Andrzej Pawnuk (7-28-03)

Dear Andrzej:

Your examples illustrate the point I made in my message (7-16-03),
namely, that standard probability theory, PT, does not address problems 
in which, as in your examples, we encounter partiality of truth and/or 
partiality of possibility. Thus, in the proposition "Robert is
half-German, quarter-French and quarter-Italian," the numbers 0.5, 0.25 and
0.25 are not probabilities but grades of membership or, equivalently, 
truth values.

Cordially yours,

Lotfi



In conclusion, I should like to thank the respondents for offering their 
constructive comments and criticisms. I am not sure, though, that I 
succeeded in persuading the respondents that the maximum entropy 
principle is not applicable when the side-conditions are imprecise, as 
they are in most realistic settings.




Lotfi



-- 
Lotfi A. Zadeh
Professor in the Graduate School, Computer Science Division
Department of Electrical Engineering and Computer Sciences
University of California
Berkeley, CA 94720-1776
Director, Berkeley Initiative in Soft Computing (BISC)

Address:
Computer Science Division
University of California
Berkeley, CA 94720-1776
[EMAIL PROTECTED]
Tel.(office): (510) 642-4959
Fax (office): (510) 642-1712
Tel.(home): (510) 526-2569
Fax (home): (510) 526-2433
Fax (home): (510) 526-5181
http://www.cs.berkeley.edu/People/Faculty/Homepages/zadeh.html

BISC Homepage URLs:
URL: http://www-bisc.cs.berkeley/
URL: http://zadeh.cs.berkeley.edu/
