Thanks, Ben, for your tips.
I'm not sure it will be that easy to do (as the non-feasible regions depend
on the model parameters), but I'm sure it's worth a try.
Thanks!
Best,
Mathieu
Ben Bolker wrote:
Mathieu Ribatet mathieu.ribatet at epfl.ch writes:
Dear list,
I'm currently
If the positive definiteness of the covariance matrix
is the only issue, then you could base a penalty on:
eps - smallest.eigen.value
whenever the smallest eigenvalue is smaller than eps.
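The thread's code is R wrapping C, but the penalty idea is easy to sketch in Python/NumPy (function names and the `scale` factor here are mine, purely for illustration): the penalty is zero while the matrix is safely positive definite, and grows linearly as the smallest eigenvalue falls below `eps`.

```python
import numpy as np

def eigen_penalty(cov, eps=1e-6, scale=1e4):
    """Penalty that grows as the covariance matrix loses positive
    definiteness: zero when the smallest eigenvalue exceeds eps,
    positive and increasing once it drops below eps."""
    smallest = np.linalg.eigvalsh(cov).min()  # eigvalsh: for symmetric input
    if smallest < eps:
        return scale * (eps - smallest)
    return 0.0

# A matrix that is not positive definite gets a positive penalty;
# a well-conditioned one gets none.
bad = np.array([[1.0, 2.0], [2.0, 1.0]])   # eigenvalues 3 and -1
good = np.eye(2)
```

Adding such a penalty to the negative log-likelihood keeps the objective finite everywhere, which matters for the optimizer discussion below.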
Patrick Burns
[EMAIL PROTECTED]
+44 (0)20 8525 0696
http://www.burns-stat.com
(home of S Poetry and "A Guide for the Unwilling S User")
Dear Patrick (and others),
Well, I used Sylvester's criterion (which is equivalent) to test for
this. But unfortunately, this is not the only issue!
To sum up quickly, it's more or less like geostatistics.
Consequently, I have several unfeasible regions (covariance, margins and
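Sylvester's criterion, mentioned above, states that a symmetric matrix is positive definite if and only if every leading principal minor (the determinant of the top-left k-by-k submatrix) is strictly positive. A minimal sketch, again in Python/NumPy for illustration (the function name is mine):

```python
import numpy as np

def sylvester_pd(mat):
    """Sylvester's criterion: a symmetric matrix is positive definite
    iff all leading principal minors are strictly positive."""
    n = mat.shape[0]
    return all(np.linalg.det(mat[:k, :k]) > 0 for k in range(1, n + 1))
```

Note this is a yes/no test only; unlike the eigenvalue-based penalty, it does not by itself say *how far* a matrix is from positive definiteness.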
If I understand your proposal correctly, then it
probably isn't a good idea.
A derivative-based optimization algorithm is going
to get upset whenever it sees negative infinity.
Genetic algorithms, simulated annealing (and, I think,
Nelder-Mead) will be okay when they see infinity,
but if all
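The point above can be sketched as follows (a toy example of mine, not from the thread): instead of returning an infinite objective on infeasible points, return the objective at the boundary plus a finite penalty proportional to the constraint violation. The surface then stays finite and sloped toward the feasible region, so even a gradient-free method like Nelder-Mead started at an infeasible point can walk back in.

```python
import numpy as np
from scipy.optimize import minimize

# Toy objective (think of it as a negative log-likelihood) with the
# feasibility constraint x > 0; the unconstrained minimum, x = 2,
# lies inside the feasible region.
def nll(x):
    return (x[0] - 2.0) ** 2

def penalised(x, scale=1e4):
    """Finite penalty on infeasible points: the objective stays
    comparable between nearby points, so the optimizer keeps a sense
    of direction.  Returning an infinite value instead would give a
    derivative-based method nothing to work with."""
    if x[0] <= 0:
        # boundary value plus a term growing with the violation
        return nll(np.array([0.0])) + scale * (-x[0])
    return nll(x)

# Start from an infeasible point; Nelder-Mead recovers the minimum.
res = minimize(penalised, x0=np.array([-1.0]), method="Nelder-Mead")
```

The penalty is continuous across the boundary (both sides approach the boundary value), which avoids creating an artificial cliff of its own.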
Dear list,
I'm currently writing C code to compute the (composite) likelihood -
well, this is done, but not really robust. The C code is wrapped in an R
function which calls the optimizer routine - optim or nlm. However, the
fitting procedure is far from robust, as the parameter space
depends
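When a constraint involves a single parameter with a fixed bound, a common alternative to penalties is to reparameterise so that every value of the working parameter is feasible; a minimal Python sketch (illustrative only; as Mathieu notes above, this may not extend to regions that depend jointly on several model parameters):

```python
import numpy as np
from scipy.optimize import minimize

# Maximum likelihood for a normal sample: sigma must stay positive,
# so optimise log(sigma) instead of sigma itself.  Every real value
# of the working parameter maps to a feasible sigma.
rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=2.0, size=500)

def nll(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)  # positive by construction
    # negative log-likelihood up to an additive constant
    return 0.5 * np.sum(((data - mu) / sigma) ** 2) + data.size * log_sigma

res = minimize(nll, x0=np.array([0.0, 0.0]), method="BFGS")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
```

Because the transformed problem is unconstrained, derivative-based methods (here BFGS) need no special handling of infeasible points at all.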