Dear NMusers,

There have been some elegant references posted to the usersnet in response to 
this question. However, one question has generally gone unanswered. Navin tells 
us that NONMEM says MINIMIZATION TERMINATED DUE TO ROUNDING ERRORS (ERROR=134); 
he then changes SIGDIGITS to a lower value (generally NSIG=2) and 
MINIMIZATION SUCCESSFUL shows up along with the STANDARD ERROR OF ESTIMATE. 
This is actually one of the recommended tips in Dr. Bonate's book for Error 
134. Dr. Bonate further explains "If the rounding error is a variance component 
then this is usually an acceptable solution".


Kindly note that the parameter estimates are almost always identical between 
the run that had minimization terminated and the run that completed successfully 
with NSIG=2. Could someone kindly explain what makes NSIG=3 so important? As an 
example, is a volume estimate of 96.3 L that much better than 93 L?
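
For what it is worth, my understanding is that NSIG (the SIGDIGITS option on 
$ESTIMATION) only sets how many significant digits the search must resolve in 
each parameter before NONMEM declares convergence; it says nothing about how 
accurate the estimates are in any clinical sense. A rough Python sketch of the 
arithmetic, reusing the two volumes above (the tolerances are simply the 
nominal 10^-NSIG values):

# Rough arithmetic only; 96.3 L and 93 L are the example volumes from above.
nsig3_tol = 10 ** -3   # ~0.1% relative precision requested with NSIG=3
nsig2_tol = 10 ** -2   # ~1% relative precision requested with NSIG=2

v_a, v_b = 96.3, 93.0
rel_diff = abs(v_a - v_b) / v_a

print(f"NSIG=3 tolerance ~{nsig3_tol:.1%}; NSIG=2 tolerance ~{nsig2_tol:.1%}")
print(f"96.3 L vs 93 L differ by {rel_diff:.1%}")
# A ~3.4% difference is larger than either tolerance, so a gap of that size
# would come from the data and model, not from relaxing NSIG from 3 to 2.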

 

Please advise... Mahesh



-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of Mark Sale - Next 
Level Solutions
Sent: Friday, July 20, 2007 9:05 PM
Cc: nmusers
Subject: RE: [NMusers] Minimization terminated ?



Nick,
    I'm interested in exactly what you mean by "unreliable".  Is it 
sensitivity/specificity for a "bad" model?  I suspect that we all would prefer 
if our models converge and have a successful covariance step.   And so (I 
think), models that pass these tests are "better" models than those that don't 
(everything else being equal).  But, if we are unable to find a model that 
passes these tests, we resort to rationalizing that it really doesn't make any 
difference anyway, and so we can move on.  You, I, and others have generated 
data that support this.  On the other hand, Stuart would, I'm pretty sure, 
suggest that models that fail a covariance step should not be considered final, 
and would cringe at the idea of accepting as final a model that did not 
converge.  I'd also suggest it might be a hurdle in getting a paper published.  
(I'll let the regulatory agencies speak for themselves on this matter.)  So, I'd 
suggest that convergence and a covariance step are valuable information and 
should not be discarded. 
But I very much support the value of visual predictive checks and NPDE.  I'd 
like to add PPC, especially if one checks both a point estimate (AUC, Cmax, 
Cmin) and some measure of variability (SE of AUC, etc.), since an artificially 
large variability can fool PPC.
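
As a rough illustration, here is a minimal Python sketch of such a 
two-statistic PPC; the observed and simulated AUC arrays below are placeholders 
to be replaced with one's own model output:

import numpy as np

# Placeholder inputs: obs_auc = observed individual AUCs (one per subject);
# sim_auc = AUCs from datasets simulated under the final model,
# shape (n_replicates, n_subjects). Replace both with real results.
rng = np.random.default_rng(0)
obs_auc = rng.lognormal(mean=4.0, sigma=0.3, size=50)
sim_auc = rng.lognormal(mean=4.0, sigma=0.5, size=(1000, 50))

def ppc_pvalue(obs_stat, sim_stats):
    # Fraction of simulated statistics at least as large as the observed one.
    return float(np.mean(sim_stats >= obs_stat))

# Point statistic: median AUC. Variability statistic: SD of AUC across subjects.
p_median = ppc_pvalue(np.median(obs_auc), np.median(sim_auc, axis=1))
p_sd = ppc_pvalue(np.std(obs_auc, ddof=1), np.std(sim_auc, axis=1, ddof=1))

print(f"PPC p-value, median AUC: {p_median:.2f}")
print(f"PPC p-value, SD of AUC : {p_sd:.2f}")
# A model with inflated variability can look fine on the median while the SD
# check flags simulated spread that is much larger than observed.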
  

Mark Sale MD
Next Level Solutions, LLC
www.NextLevelSolns.com





-------- Original Message --------
Subject: Re: [NMusers] Minimization terminated ?
From: Nick Holford <[EMAIL PROTECTED]>
Date: Fri, July 20, 2007 5:04 pm
To: nmusers <nmusers@globomaxnm.com>


Navin,

NONMEM is quite unreliable when it comes to deciding if it has
converged. Minor changes in initial estimates with essentially no
difference in the final estimates and OBJ can produce 1) SUCCESSFUL +
COVARIANCE, 2) SUCCESSFUL + FAILED COVARIANCE, or 3) TERMINATED. 

My guess is that this is because of numerical rounding errors (not the ones
that NONMEM refers to in its error message), so that which of these outcomes
you get becomes essentially a random event. The bottom line is NOT to pay
attention to NONMEM's declarations of success but to focus on whether the
parameters make sense, whether the fits look good, and whether a VPC looks OK
http://www.page-meeting.org/page/page2005/PAGE2005P105.pdf

and even (if you have got lots of spare time) whether the npde fails to
reject the null.
http://www.page-meeting.org/pdf_assets/9146-ecomets_a4page07.pdf
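
For anyone who wants to see what the basic percentile comparison behind a VPC
amounts to, here is a toy Python sketch with made-up data standing in for the
observed and simulated concentrations:

import numpy as np

# Toy stand-ins: obs holds observed concentrations with matching time bins,
# sim holds the same design simulated many times under the final model,
# shape (n_replicates, n_observations). Replace with real data and simulations.
rng = np.random.default_rng(1)
time_bin = np.repeat(np.arange(6), 40)      # 6 time bins, 40 observations each
obs = rng.lognormal(mean=1.0, sigma=0.4, size=time_bin.size)
sim = rng.lognormal(mean=1.0, sigma=0.4, size=(500, time_bin.size))

for b in np.unique(time_bin):
    idx = time_bin == b
    obs_pct = np.percentile(obs[idx], [5, 50, 95])
    # Percentiles within the bin for each simulated replicate, then their medians
    sim_pct = np.median(np.percentile(sim[:, idx], [5, 50, 95], axis=1), axis=1)
    print(f"bin {b}: observed 5/50/95 = {np.round(obs_pct, 2)}, "
          f"simulated 5/50/95 = {np.round(sim_pct, 2)}")
# The visual check is whether the observed percentiles track the simulated ones
# across bins; the PAGE poster linked above describes the full procedure.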



Several investigations of bootstraps have shown that it makes little
difference whether you include only successful runs or all runs. The
advantage of including all runs is that it is simpler to process the
results, and perhaps the confidence intervals are more precisely
estimated because you have more runs.

http://www.cognigencorp.com/nonmem/nm/99jul152003.html
http://www.nature.com/clpt/journal/v77/n2/abs/clpt200514a.html
http://www.page-meeting.org/?abstract=992
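
To make the comparison concrete, here is a minimal Python sketch; the bootstrap
table and its column names are made up, and in practice you would read the
replicate summary produced by WFN or your own scripts:

import numpy as np
import pandas as pd

# Made-up bootstrap results: one row per replicate, a parameter column ('CL')
# and a flag for whether minimization was successful. Both names are placeholders.
rng = np.random.default_rng(2)
boot = pd.DataFrame({
    "CL": rng.normal(loc=5.0, scale=0.4, size=1000),
    "minimization_successful": rng.random(1000) > 0.15,   # ~15% terminated runs
})

def percentile_ci(values, level=0.95):
    lo, hi = np.percentile(values, [(1 - level) / 2 * 100, (1 + level) / 2 * 100])
    return lo, hi

ci_all = percentile_ci(boot["CL"])
ci_ok = percentile_ci(boot.loc[boot["minimization_successful"], "CL"])

print(f"95% CI from all runs        : {ci_all[0]:.2f} to {ci_all[1]:.2f}")
print(f"95% CI from successful only : {ci_ok[0]:.2f} to {ci_ok[1]:.2f}")
# In the studies linked above the two intervals are usually very close, which is
# the argument for keeping every run and benefiting from the larger sample size.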



Nick



navin goyal wrote:

> Dear NM Users,
> I am trying to model some POPPK data in NONMEM VI.
> Sometimes I get the following message in the output file:
>
> MINIMIZATION TERMINATED
>  DUE TO ROUNDING ERRORS (ERROR=134)
>  NO. OF FUNCTION EVALUATIONS USED: 1103
>  NO. OF SIG. DIGITS UNREPORTABLE
>
> But when I change the SIGDIGITS to a lower value the minimization is
> successful. What exactly is happening in this case? Is there something
> I am missing?
>
> What about the parameter estimates obtained in such a run?
>
> Another question related to this: when I bootstrap a model in Wings
> for NONMEM (WFN), I get a few runs with the same message,
> MINIMIZATION TERMINATED DUE TO ROUNDING ERRORS (ERROR=134) NO. OF
> SIG. DIGITS UNREPORTABLE.
> Does this mean that I should discard these runs from the calculations?



--
Nick Holford, Dept Pharmacology & Clinical Pharmacology
University of Auckland, 85 Park Rd, Private Bag 92019, Auckland, New Zealand
n.holford@auckland.ac.nz tel:+64(9)373-7599x86730 fax:+64(9)373-7090
www.health.auckland.ac.nz/pharmacology/staff/nholford
