Hi everybody,

I am currently working on mixtures of two densities,
    f(x_i, theta) = (1 - theta) * f1(x_i) + theta * f2(x_i),
in particular on the behavior of the variance for theta = 0 (so the sample
comes only from the first distribution).
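
For a reproducible example, here is roughly how I build the two vectors of
density values (the normal densities below are stand-ins, not my actual f1
and f2):

set.seed(1)
x     <- rnorm(200)             # sample drawn from f1 alone, i.e. theta = 0
vect1 <- dnorm(x, mean = 0)     # f1 evaluated at the sample
vect2 <- dnorm(x, mean = 2)     # f2 evaluated at the sample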
To determine the maximum likelihood estimator I use Newton-Raphson
iteration (a sketch of the update is given below). But when I compute the
first derivative of the log-likelihood I get a nonlinear function (with
several asymptotes), which is completely absurd.
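
For reference, here is roughly the update I am iterating (a sketch: the
starting value and tolerance are placeholders, and the derivative is
computed analytically from the score function phy given below):

newton <- function(theta0, vect1, vect2, tol = 1e-8, maxit = 100) {
    theta <- theta0
    for (i in seq_len(maxit)) {
        denom  <- (1 - theta) * vect1 + theta * vect2
        score  <- sum((vect2 - vect1) / denom)        # phy(theta)
        dscore <- -sum((vect2 - vect1)^2 / denom^2)   # derivative of phy
        step   <- score / dscore
        theta  <- theta - step
        if (abs(step) < tol) break                    # stop when the step is small
    }
    theta
}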

This is my function to compute the first derivative:

phy <- function(theta, vect1, vect2) {
    ## score of the mixture log-likelihood:
    ## sum_i (f2(x_i) - f1(x_i)) / ((1 - theta) * f1(x_i) + theta * f2(x_i))
    sum((vect2 - vect1) / ((1 - theta) * vect1 + theta * vect2))
}

Note: vect1 and vect2 contain the values of the two densities f1 and f2
evaluated at the previously drawn sample.
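
To show the behavior I describe, I evaluate the function on a grid of theta
values (the grid range here is arbitrary):

grid <- seq(-2, 2, by = 0.01)
vals <- sapply(grid, phy, vect1 = vect1, vect2 = vect2)
plot(grid, vals, type = "l", xlab = "theta", ylab = "phy(theta)")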

Besides, vect2 - vect1 is constant in theta, and (1 - theta) * vect1 +
theta * vect2 is linear in theta and always defined, so I was expecting a
linear function for the first derivative. To my mind the problem likely
comes from the division, but I don't understand why the results are so
skewed.

Have you got any suggestions, please?

-- 
Clément Viel
Student in engineering school of Polytech-Lille
http://www.polytech-lille.fr

