In article <[EMAIL PROTECTED]>,
John Bailey <[EMAIL PROTECTED]> wrote:
>On 01 Apr 2002 18:21:18 GMT, [EMAIL PROTECTED] (Michael J Hardy) wrote:

>>> >    Look:  Most practitioners of Bayesian inference probably do not
>>> >know what entropy is.  That appears to contradict what you said in
>>> >your posting that I first answered.  Can you dispute that?

>In an earlier post, John Bailey's response to Hardy's statement was:
>>> I will definitely dispute the first part.  
>>> I suppose there may be *practitioners of Bayesian inference
>>> who are weak on the concept of entropy* but it is clearly and
>>> unambiguously a part of the theory of its use.

>Mike Hardy then replied:
>>     I don't doubt that people you worked with are familiar with
>>entropy, nor that some people who do Bayesian inference use entropy,
>>but it is perfectly obvious that such familiarity is not needed in
>>order to do Bayesian inference.  Why do you call it "clearly and
>>unambiguously a part of the theory of its use"?

>My exposures to Bayesian methodology have all included a
>discussion of how to determine a neutral Bayesian prior and of
>maximum entropy as a means to that end.

>John 

Statistics is not methodology.  Treating it as such causes
people to use totally inappropriate procedures.

The first step is to state the problem, and settling for a
mathematically convenient formulation can be worse than
useless.  Bayesian reasoning requires that the USER be
the provider of the loss-prior combination.  One might
want to use something simpler, but only if it can be
proved to be reasonably good.
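To make the loss-prior point concrete, here is a minimal sketch
in Python (my own illustration; the Beta(2, 5) prior and the
counts are assumed, not taken from any real analysis).  The same
prior gives different Bayes estimates under different losses, so
the loss is not an afterthought:

    # Minimal sketch: the Bayes estimate depends on the loss function.
    # The Beta(2, 5) prior and the counts below are illustrative only.
    from scipy import stats

    successes, failures = 7, 13               # assumed data
    posterior = stats.beta(2 + successes, 5 + failures)

    # Squared-error loss  -> Bayes estimate is the posterior mean.
    # Absolute-error loss -> Bayes estimate is the posterior median.
    print("posterior mean  :", posterior.mean())
    print("posterior median:", posterior.median())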

So we can use least squares without normality: the
Gauss-Markov Theorem tells us that the results are just
about as good without normality as with it.  No such
guarantee exists for mathematically convenient but
inappropriate priors.  And what matters is not how well
the prior is approximated, but how well the resulting
solution is.
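A minimal sketch of the Gauss-Markov point (my illustration; the
design, sample size, and skewed error distribution are all
assumed): the least-squares estimator stays unbiased, with
covariance sigma^2 (X'X)^{-1}, even though the errors are far
from normal.

    # Minimal sketch: OLS is unbiased with Gauss-Markov covariance
    # sigma^2 (X'X)^{-1} under non-normal (shifted-exponential) errors.
    import numpy as np

    rng = np.random.default_rng(0)
    n, beta_true = 200, np.array([1.0, -2.0])
    X = np.column_stack([np.ones(n), rng.uniform(-1.0, 1.0, n)])

    estimates = []
    for _ in range(5000):
        eps = rng.exponential(1.0, n) - 1.0   # mean 0, variance 1, skewed
        y = X @ beta_true + eps
        estimates.append(np.linalg.lstsq(X, y, rcond=None)[0])
    estimates = np.array(estimates)

    print("mean estimate   :", estimates.mean(axis=0))  # ~ beta_true
    print("empirical cov   :", np.cov(estimates.T))
    print("Gauss-Markov cov:", np.linalg.inv(X.T @ X))  # sigma^2 = 1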

Bayesian priors should not be "neutral" unless it can
be shown that not much is lost by using such a prior.
Conjugate priors, "uninformative" priors, and maximum
entropy priors are, as such, unjustified computational
cop-outs.
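Whether "not much is lost" can be checked directly.  A minimal
sketch (my illustration; the Beta(8, 2) generating prior and the
sample sizes are assumed): compare the average squared error of
the posterior mean under a flat Beta(1, 1) prior with that under
the prior actually generating the parameter.

    # Minimal sketch: how much does a "neutral" flat prior cost when
    # an informative prior is the right one?  All settings assumed.
    import numpy as np

    rng = np.random.default_rng(1)
    n_trials, n_reps = 10, 20000
    a, b = 8.0, 2.0                  # assumed generating prior Beta(8, 2)

    theta = rng.beta(a, b, n_reps)
    k = rng.binomial(n_trials, theta)

    est_flat = (k + 1) / (n_trials + 2)      # posterior mean, Beta(1, 1)
    est_true = (k + a) / (n_trials + a + b)  # posterior mean, true prior

    print("avg sq. error, flat prior:", np.mean((est_flat - theta) ** 2))
    print("avg sq. error, true prior:", np.mean((est_true - theta) ** 2))

If the two numbers are close, the convenient prior may be
defensible; if not, it is exactly the cop-out described above.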



-- 
This address is for information only.  I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette, IN 47907-1399
[EMAIL PROTECTED]         Phone: (765)494-6054   FAX: (765)494-0558