Dear Professor Zadeh and colleagues:

> What we know about the probability distribution, P, is that its mean is
> approximately a and its variance is approximately b, where
> "approximately a" and "approximately b" are fuzzy numbers defined by
> their membership functions.  The question is:  What is the
> entropy-maximizing P?  In a more general version, what we know are
> approximate values of the first n moments of P.  Can anyone point to a
> discussion of this issue in the literature?
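
    (For the exact-constraint version the answer is classical: over the
real line, the density of maximum entropy with mean a and variance b is
the Gaussian, p(x) = (2*pi*b)^(-1/2) * exp(-(x-a)^2 / (2b)); the question
is what survives of this when a and b go fuzzy.)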

    If you mean "is there an author who both practices entropy maximization
and also uses membership functions to define (fuzzy) numbers?", then you would
be as well-situated as anyone on this list to name one. It is a remarkable 
coincidence of tastes.

    If you mean "is there a maxent literature where the prior information is 
imperfectly known moments?", then this is a typical, rather than an
exceptional, situation. Try:

E. T. Jaynes, "Prior Probabilities," IEEE Transactions on Systems Science and
Cybernetics 4 (1968), 227-241.

    Yes, maximum entropy assumes the existence of "a procedure which
will determine unambiguously whether [a candidate prior] does or does
not agree with the information" used to state a problem.

    It is the procedure, not the form in which the information is
stated, which bears the burden of being decisive. Little else is asked
of the procedure, which can be crude and expedient. Jaynes' example in
section III features imperfect knowledge of a moment being processed
by bald rounding-off. So much for the Gordian knot of "approximately."
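
    To see the move in miniature, here is a small numerical sketch (not
Jaynes's own calculation; the six-point support and the rounded target
mean of 4.5 below are my own illustrative choices): round the fuzzy
moment off to a definite number, then maximize entropy subject to that
single constraint by solving for the Lagrange multiplier.

import numpy as np
from scipy.optimize import brentq

# Outcomes and a vaguely known mean, bluntly rounded to 4.5 before the
# maximization -- the rounding-off is the whole treatment of "approximately".
x = np.arange(1, 7)        # support: 1, 2, ..., 6 (illustrative choice)
target_mean = 4.5          # rounded-off value of the fuzzy first moment

def mean_at(lam):
    # Mean of the maxent distribution with p_i proportional to exp(lam * x_i).
    w = np.exp(lam * x)
    return (w / w.sum()) @ x

# Find the Lagrange multiplier that reproduces the rounded mean.
lam = brentq(lambda l: mean_at(l) - target_mean, -10.0, 10.0)
p = np.exp(lam * x)
p /= p.sum()

print("multiplier lambda =", lam)
print("maxent probabilities:", np.round(p, 4))
print("achieved mean:", p @ x)

The same recipe, with more constraints and more multipliers, covers the
general version with the first n moments.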

    This insensitivity to nuance is explained in an extensive
"robustness" literature, which includes maxent contributors (example
offered for establishing existence: C. C. Rodriguez, "Bayesian
Robustness: A New Look from Geometry," in G. L. Heidbreder (ed.),
_Maximum Entropy and Bayesian Methods_, Kluwer, 1996, pp. 87-96).

    Finally, there is a charming substitute for robustness called
"opportunity to learn," also anticipated in the Jaynes cited, but that
is another thread.

    Happy hunting.

                                                            Paul

