It's exactly what I derived myself, so I understand it :)
But it might be difficult for a casual reader.

My suggestions:
- you could add a factor graph to ease thinking about it.
- [most important] describe what x, sigma_i, and u_i are
- [important] you could explicitly state Bayes' theorem to derive the
posterior f(x).
- rename f(x) to P(X = x) or the density p(x)
- you could note that the mean coincides with the mode (peak) of the
posterior likelihood
- you should state what the RAVE estimator is, and why it is biased
- [important] you should state your final estimator as an alternative to RAVE
- a description of the experimental setup would be useful :)
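By the way, the inverse-variance combination quoted below can be sketched in a few lines. This is only my reading of your formulas (I assume the combined mean is normalized by the combined iv, which the shorthand below leaves implicit):

```python
# Sketch of inverse-variance weighting, per the formulas quoted below.
# Each estimate i is a (mean, variance) pair; iv = 1/variance.
def combine(estimates):
    """Combine independent estimates given as (mean, variance) pairs."""
    ivs = [1.0 / var for _, var in estimates]
    combined_iv = sum(ivs)  # Combined iv = sum(iv)
    # Combined mean = sum(mean * iv) / sum(iv)
    combined_mean = sum(m * iv for (m, _), iv in zip(estimates, ivs)) / combined_iv
    return combined_mean, 1.0 / combined_iv  # (mean, variance)

# Two equally reliable estimates average, and the variance halves:
print(combine([(1.0, 1.0), (3.0, 1.0)]))  # -> (2.0, 0.5)
```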

Lukasz

2008/9/23 Jason House <[EMAIL PROTECTED]>:
> On Mon, Sep 22, 2008 at 1:21 PM, Łukasz Lew <[EMAIL PROTECTED]> wrote:
>>
>> Hi,
>>
>> On Mon, Sep 22, 2008 at 17:58, Jason House <[EMAIL PROTECTED]>
>> wrote:
>> > On Sep 22, 2008, at 7:59 AM, Magnus Persson <[EMAIL PROTECTED]>
>> > wrote:
>> >
>> > The results of the math are most easily expressed in terms of inverse
>> > variance (iv=1/variance)
>> >
>> > Combined mean = sum( mean * iv ) / sum( iv )
>> > Combined iv = sum( iv )
>> >
>> > I'll try to do a real write-up if anyone is interested.
>>
>> I am very interested. :)
>>
>> Lukasz
>
>
> Attached is a quick write up of what I was talking about with some math.
>
> PS: Any tips on cleanup and making it a mini publication would be
> appreciated.  I've never published a paper before.  Would this be too small?
>
_______________________________________________
computer-go mailing list
[email protected]
http://www.computer-go.org/mailman/listinfo/computer-go/
