Thank you for the links!
One trouble with Bayesian statistics is the naming of probability. It is 
problematic to talk about the probability of life on Mars, because we do not 
have a large population of Mars-like planets in which to count the proportion 
of planets containing life.
Life on Mars is a hypothesis. Hypotheses do not have probabilities. Hypotheses 
have credibilities. It makes sense to talk about the credibility that there is 
life on Mars. 
This tiny change of wording may make the Bayesian approach easier to accept.
Secondly, the notation P(A|B) for the conditional probability of event A, given 
hypothesis B, can be simplified by omitting the letter P and writing simply 
(A|B). The unconditional probability of A is then written (A|0), and Bayes' 
rule reads
    (A|B)(B|0) = (A|0)(B|A)
meaning that the conditional probability of A given B, times the prior 
credibility of B, equals the unconditional probability of A times the 
posterior credibility of B given A.
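For instance, with made-up numbers (a joint credibility of 0.3 for A and B 
together, and unconditional credibilities 0.5 and 0.4 for A and B), the rule 
can be checked in a J session:
   'pAB pA pB' =. 0.3 0.5 0.4    NB. (A and B), (A|0), (B|0): made-up values
   AgB =. pAB % pB               NB. (A|B) = 0.75
   BgA =. pAB % pA               NB. (B|A) = 0.6
   (AgB * pB) = pA * BgA         NB. both sides equal the joint credibility 0.3
1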
Thirdly, consider these two J programs:
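NB. x deduce y: mean and standard deviation of each category count in a
NB. random sample of size y from a population with category counts x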
deduce=.%~`*`:3"2@(,: (%:@* -.))@(+/@[ %~ 1 , ,:)
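NB. x predict y: mean and standard deviation of each category count in a
NB. further sample of size y, given an observed sample with category counts x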
predict=.(deduce~ -@>:)~
They compute mean values and standard deviations.
If a population contains 20 items of category 1, 5 items of category 2, and 0 
items of category 3, then a random sample of 10 items will contain 8±1 items 
of category 1, 2±1 items of category 2, and 0±0 items of category 3. This
was computed with program deduce like this:
   20 5 0 deduce 10
8 2 0
1 1 0
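The means can be checked by hand: they are simply the sample size times the 
population proportions (an assumption about what deduce computes, but it 
agrees with the output above):
   pop =. 20 5 0
   n   =. 10
   n * pop % +/ pop
8 2 0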
If a sample contains 20 items of category 1, 5 items of category 2, and 0 
items of category 3, then another sample of 10 items will contain 7.5±1.6 
items of category 1, 2.1±1.5 items of category 2, and 0.4±0.7 items of
category 3. This was computed with program predict like this:
   20 5 0 predict 10
7.5 2.14286 0.357143
1.56745 1.48533 0.671764
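The means here appear to follow Laplace's rule of succession: each observed 
count is incremented by one before the proportions are taken (again an 
assumption read off the output, not derived from the definition of predict):
   obs =. 20 5 0
   n   =. 10
   n * (>: obs) % +/ >: obs
7.5 2.14286 0.357143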
The close relationship between deduce and predict is the main result of this 
research, "Statistical induction and prediction".
Thanks. Bo.
 

    At 6:39 on Tuesday, 29 March 2016, Ian Clark <[email protected]> wrote:
 
 

 You're welcome, Devon.

I've quite forgotten I'd done that. I recognise the article, now I see
it again. A bit of a Bayesian myself, I must have repaired it because
I remember thinking at the time what a good intro to Bayesianism it
was, with applications to human factors topics I was deeply into
throughout the 1980s. Yes, it needs J versions of the APL functions.

Amazed the APL chars have survived the migration to WikiMedia. Feel
free to strip out my initials etc and take back ownership of your
essay.

On Tue, Mar 29, 2016 at 1:14 AM, Devon McCormick <[email protected]> wrote:
> Hi - I've put up some links relating to my recent talk introducing Bayesian
> statistics - http://www.sigapl.org/BayesianLinks.php .
>
> Thanks to Ian Clark for reviving my old APL essay on this (
> http://code.jsoftware.com/wiki/User:Devon_McCormick/DynamicLinearModels/BayesianFinancialDynamicLinearModel_iac)
> by fixing the broken APL characters.  I may get around to re-doing this in
> J one of these days.
>
> --
>
> Devon McCormick, CFA
>
> Quantitative Consultant
----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm
