Dear Kathy:
In your thought-provoking comment you touch upon a number of basic
issues. In my response, I will focus on just a few.
Underlying our exchanges is a basic difference in our views
regarding the sufficiency of standard probability theory, call it PT,
to deal with uncertainty and imprecision. In your view, which at
present is an overwhelmingly majority view, PT is sufficient. In my
view, which at present is an overwhelmingly minority view, it is
not. An exposition of my view may be found in the paper entitled
"Toward a Perception-Based Theory of Probabilistic Reasoning with
Imprecise Probabilities," which appeared in a special issue on
imprecise probabilities of the Journal of Statistical Planning and
Inference, Vol. 105, pp. 233-264, 2002 (downloadable at
http://www-bisc.cs.berkeley.edu/BISCProgram/Projects.htm). In the
preface to the issue, the co-editor, Professor Jean-Marc Bernard, has
this to say:
"There is a wide range of views concerning the sources and
significance of imprecision. This ranges from de Finetti's view, that
imprecision arises merely from incomplete elicitation of subjective
probabilities, to Zadeh's view, that most of the information relevant
to probabilistic analysis is intrinsically imprecise, and that there
is imprecision and fuzziness not only in probabilities, but also in
events, relations and properties such as independence. The research
program outlined by Zadeh is a more radical departure from standard
probability theory than the other approaches in this volume."
Please note that my critique of PT is constructive in the
sense that what is suggested in my JSPI paper is a generalization of
PT which can and should enhance the ability of probability theory to
deal with real-world problems.
The point of departure in my approach is the observation that
there are two concepts which play a key role in human cognition,
partiality and granularity. Partiality relates to the fact that most
human concepts are partial in the sense that they are associated with
a scale, that is, are a matter of degree. Thus, we have partial
understanding, partial knowledge, partial similarity, partial truth,
partial certainty and partial possibility, with the last three
standing out in importance. Furthermore, reflecting the bounded
ability of sensory organs, and ultimately the brain, to resolve detail
and store information, the scale is granulated, with a granule being a
clump of values drawn together by indistinguishability, similarity,
proximity and functionality. For example, the granules of likelihood
might be likely, unlikely, very unlikely, etc.
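As a rough illustration (a minimal sketch of my own, with membership functions and parameters chosen arbitrarily for the purpose, not drawn from the JSPI paper), granules of likelihood may be represented as fuzzy subsets of the unit interval of probability values:

```python
# Minimal sketch: granules of likelihood as fuzzy subsets of [0, 1].
# The trapezoidal shapes and parameters are illustrative assumptions,
# not canonical values.

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function: 1 on [b, c], 0 outside [a, d],
    linear in between."""
    if b <= x <= c:
        return 1.0
    if x <= a or x >= d:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# A granulated scale of likelihood: each granule is a clump of
# probability values drawn together by similarity and proximity.
granules = {
    "very unlikely": lambda p: trapezoid(p, 0.0, 0.0, 0.05, 0.15),
    "unlikely":      lambda p: trapezoid(p, 0.05, 0.15, 0.30, 0.45),
    "likely":        lambda p: trapezoid(p, 0.55, 0.70, 1.0, 1.0),
}

# A numerical probability such as 0.1 is compatible, to a degree,
# with more than one granule -- the granules overlap:
for label, mu in granules.items():
    print(label, round(mu(0.1), 2))
```

Note that the granules overlap; a given numerical likelihood belongs to neighboring granules to different degrees, which is what distinguishes a granulated scale from a crisp partition.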
Standard probability theory, PT, addresses partiality of
certainty, but what it does not address is partiality of truth and
partiality of possibility. As a consequence, PT does not have the
capability to deal with perceptions, which are intrinsically imprecise
and, in general, involve partiality of certainty, truth and
possibility. For example, the perception "it is very unlikely that it
will be a warm day tomorrow" involves an imprecise perception of
likelihood and an imprecise perception of temperature.
Standard probability theory provides no machinery for (a)
representing the meaning of perceptions described in a natural
language; and (b) reasoning with them. An example is: "Usually Robert
leaves the office at about 6 pm and arrives at home about half an hour
later. When does Robert arrive at home?"
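To suggest what such machinery might look like (this is only a sketch of my own, with the spreads of "about 6 pm" and "about half an hour" assumed for illustration), the two perceptions can be modeled as triangular fuzzy numbers and added:

```python
# Minimal sketch: answering "When does Robert arrive at home?" by
# fuzzy arithmetic.  The spreads below are illustrative assumptions.
# The usuality qualifier "usually" would attach a fuzzy probability
# to the answer; that step is beyond this sketch.

def tri(x, a, m, b):
    """Triangular membership function with support (a, b) and peak m."""
    if x <= a or x >= b:
        return 0.0
    if x <= m:
        return (x - a) / (m - a)
    return (b - x) / (b - m)

# Departure time in hours: "about 6 pm" ~ 18.0, assumed spread 0.375 h
departure = (17.625, 18.0, 18.375)
# Travel time: "about half an hour" ~ 0.5 h, assumed spread 0.125 h
travel = (0.375, 0.5, 0.625)

# The sum of two triangular fuzzy numbers is again triangular,
# with parameters added component-wise:
arrival = tuple(d + t for d, t in zip(departure, travel))
print("arrival ~", arrival)   # (18.0, 18.5, 19.0)

# Degree to which "Robert arrives at about 6:30 pm" (18.5) holds:
print(tri(18.5, *arrival))    # 1.0
```

The answer is not a number but a fuzzy number: "about 6:30 pm," with a spread inherited from the imprecision of the premises.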
Perceptions, and especially perceptions of likelihood, are
intrinsically imprecise. This reality is overlooked in subjective
probability theory--a theory in which subjective probabilities are
assumed to be crisply defined. Thus, if I am asked, "What is the
probability that tomorrow will be a warm day?" then, using the analogy
of the spinner, my perception would correspond to a wedge with fuzzy
rather than crisp edges. In other words, my perception would be a
fuzzy rather than crisp probability. (See the book "Fuzzy
Probabilities," by J. Buckley, Springer-Verlag, 2003.) What this
means is that axiomatization of subjective degrees of belief cannot be
accomplished within the conceptual structure of bivalent logic.
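To make the wedge-with-fuzzy-edges analogy concrete (again a minimal sketch of my own, with parameters assumed for illustration and not drawn from Buckley's book), a fuzzy probability can be represented by its alpha-cuts, nested intervals of probability values:

```python
# Minimal sketch: a crisp subjective probability is a single number p;
# a fuzzy probability is a fuzzy number on [0, 1].  Here the perception
# "approximately 0.2" is represented by a triangular fuzzy number and
# examined through its alpha-cuts.  Parameters are assumptions.

def alpha_cut(alpha, a, m, b):
    """Alpha-cut of the triangular fuzzy number (a, m, b): the interval
    of values whose membership is at least alpha."""
    return (a + alpha * (m - a), b - alpha * (b - m))

# "Approximately 0.2": fully compatible at 0.2, ruled out below 0.1
# or above 0.35 (spreads assumed for illustration).
a, m, b = 0.1, 0.2, 0.35

# A crisp wedge would give one interval; a fuzzy wedge gives a nested
# family of intervals, one per degree of membership:
for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(alpha, a, m, b)
    print(f"alpha = {alpha}: probability in [{lo:.3f}, {hi:.3f}]")
```

At membership degree 1 the cut collapses to the single value 0.2; at lower degrees it widens, which is precisely the fuzziness of the wedge's edges.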
With regard to the maximum entropy principle, you seem to
agree that maximization subject to imprecise constraints involves a
great deal of arbitrariness. As a consequence, the uniqueness of the
entropy-maximizing distribution is lost.
In summary, what is almost universally unrecognized is that
standard probability theory has fundamental limitations which are
rooted in the use of bivalent logic as its foundation. Abandonment of
bivalence is a prerequisite to enhancing the power of probability
theory to deal with real-world problems.
With my warm regards,
Lotfi
--
Professor in the Graduate School, Computer Science Division
Department of Electrical Engineering and Computer Sciences
University of California
Berkeley, CA 94720-1776
Director, Berkeley Initiative in Soft Computing (BISC)