Hi,

The two calls to self.relax.specific.model_free.model_statistics() and
self.relax.generic.model_selection.aic() were exactly what you needed.
 Sorry I couldn't help more, work has been flat out here and I don't
have internet at home yet (useless Deutsche Telekom!).  I've had a
close look at all the values using the software Grace and I think I
know what is happening.  From a distance, viewing all the values, it
looks like the algorithm has converged.  But if you zoom right in to
the 'flat' line you'll see what looks like a repetitive pattern
emerging.  It looks like you have stumbled on a circular loop in the
problem (see my paper at
http://www.rsc.org/publishing/journals/MB/article.asp?doi=b702202f or
chapter 5 of my thesis
http://eprints.infodiv.unimelb.edu.au/archive/00002799/ for a
description of the highly interlinked, yet unavoidable, mathematical
optimisation and statistical model selection problem).  Although the
number of parameters is not changing, I have a feeling that small
parameter changes are still occurring.  Possibly along the lines of one
residue having a tiny Rex value in one round and another residue
having nothing, and later on these swap.  My guess is that you are
circularly sliding between these different, yet statistically
indistinguishable, model-free universes in an infinite loop.  If you
ran the algorithm for a long time, maybe you would eventually break out
of the loop and find the unique solution.

Despite there being no convergence, it's probably safe to terminate
the algorithm at this point.  Because the changes are so small, the
differences are unlikely to be statistically significant.  It is still
very useful to note that this is happening, though, and if you hunt down
the two (or maybe a few more) problematic parameters you should be able
to see exactly what is happening and judge its significance for yourself.
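
A quick way to spot both the repetition and the offending rounds is to
group the lines of the PARAMS.dat file (written by your script below) by
their chi-squared values.  A rough sketch, assuming that file's
"name: k n chi2 aic" layout:

# Group the rounds by the exact chi-squared string printed to PARAMS.dat.
groups = {}
for line in open('PARAMS.dat'):
    fields = line.split()
    if len(fields) != 5:
        continue
    name, chi2 = fields[0].rstrip(':'), fields[3]
    groups.setdefault(chi2, []).append(name)

# Report any chi-squared value reached by more than one round.
for chi2, names in groups.items():
    if len(names) > 1:
        print("chi2 %s repeats in: %s" % (chi2, ', '.join(names)))

This only catches exact string matches, of course - you may want to round
to fewer decimal places to pick up near-identical rounds as well.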
Again, sorry for not responding earlier.

Cheers,

Edward


On 6/25/07, Douglas Kojetin <[EMAIL PROTECTED]> wrote:
> Thanks Edward.  I got it to work using the following script:
>
> ##-start of script
>
> import glob
>
> # Find all the optimisation rounds of the prolate run.
> runs = glob.glob('prolate/round_*')
> out = open('PARAMS.dat', 'w')
>
> # Loop over the runs.
> for name in runs:
>     # Create the run and load the results of this round.
>     run.create(name, 'mf')
>     results.read(run=name, file='results', dir=name+'/opt')
>
>     # Global statistics: parameter count k, data count n, and chi-squared.
>     k, n, chi2 = self.relax.specific.model_free.model_statistics(run=name, global_stats=1)
>
>     # AIC value for this round, written out as "name: k n chi2 aic".
>     aic = self.relax.generic.model_selection.aic(chi2, k, n)
>     out.write("%s: %d %d %0.30f %0.30f\n" % (name, k, n, chi2, aic))
> out.close()
>
> ##-end
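>
> (Side note, just from eyeballing the numbers below rather than from the
> relax source: the aic column looks like plain AIC = chi2 + 2*k, so each
> line can be sanity-checked by hand, e.g. for round_1:)
>
> # Hand check of round_1: chi2 + 2*k should reproduce the aic column
> # to within double precision.
> print(785.330531871414336819725576788187 + 2*276)   # ~1337.3305318714142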
>
>
> """
> # round: k n chi2 aic
> prolate/round_1: 276 675 785.330531871414336819725576788187 1337.330531871414223132887855172157
> prolate/round_2: 274 675 786.656854782415166482678614556789 1334.656854782415166482678614556789
> prolate/round_3: 275 675 784.104495289329975094005931168795 1334.104495289329861407168209552765
> prolate/round_4: 275 675 783.543316702498373160779010504484 1333.543316702498486847616732120514
> prolate/round_5: 273 675 786.500523476859029869956430047750 1332.500523476859143556794151663780
> prolate/round_6: 275 675 784.433290432082458210061304271221 1334.433290432082458210061304271221
> prolate/round_7: 274 675 786.264734828735640803643036633730 1334.264734828735527116805315017700
> prolate/round_8: 274 675 785.887140331052023611846379935741 1333.887140331052023611846379935741
> prolate/round_9: 274 675 785.887140331170371609914582222700 1333.887140331170485296752303838730
> prolate/round_10: 274 675 785.887140331282466831908095628023 1333.887140331282353145070374011993
> prolate/round_11: 274 675 785.887140331283262639772146940231 1333.887140331283262639772146940231
> prolate/round_12: 274 675 785.887140331282807892421260476112 1333.887140331282807892421260476112
> prolate/round_13: 274 675 785.887140331283376326609868556261 1333.887140331283262639772146940231
> prolate/round_14: 274 675 785.887140331282921579258982092142 1333.887140331282807892421260476112
> prolate/round_15: 274 675 785.887140331282353145070374011993 1333.887140331282353145070374011993
> prolate/round_16: 274 675 785.887140331283262639772146940231 1333.887140331283262639772146940231
> prolate/round_17: 274 675 785.887140331052364672359544783831 1333.887140331052250985521823167801
> prolate/round_18: 274 675 785.887140331284172134473919868469 1333.887140331284172134473919868469
> prolate/round_19: 274 675 785.887140331283262639772146940231 1333.887140331283262639772146940231
> prolate/round_20: 274 675 785.887140331282694205583538860083 1333.887140331282807892421260476112
> prolate/round_21: 274 675 785.887140331284967942337971180677 1333.887140331285081629175692796707
> prolate/round_22: 274 675 785.887140331337491261365357786417 1333.887140331337377574527636170387
> prolate/round_23: 274 675 785.887140331283944760798476636410 1333.887140331283944760798476636410
> prolate/round_24: 274 675 785.887140331283376326609868556261 1333.887140331283262639772146940231
> prolate/round_25: 274 675 785.887140331282921579258982092142 1333.887140331282807892421260476112
> prolate/round_26: 274 675 785.887140331282353145070374011993 1333.887140331282353145070374011993
> prolate/round_27: 274 675 785.887140331283262639772146940231 1333.887140331283262639772146940231
> prolate/round_28: 274 675 785.887140331052364672359544783831 1333.887140331052250985521823167801
> prolate/round_29: 274 675 785.887140331284172134473919868469 1333.887140331284172134473919868469
> prolate/round_30: 274 675 785.887140331283262639772146940231 1333.887140331283262639772146940231
> prolate/round_31: 274 675 785.887140331282694205583538860083 1333.887140331282807892421260476112
> """
>
> Let me know if you would like any other information from this or
> other tensor rounds to track down the problem.
>
> Thanks,
> Doug
>
>
> On Jun 25, 2007, at 10:21 AM, Edward d'Auvergne wrote:
> > Unfortunately my scripts are archived on my personal laptop which I
> > don't have with me here at work.  It may involve using certain relax
> > functions (not user functions) located in 'self.relax.generic' or
> > 'self.relax.specific'.  Most likely you will need
> > 'self.relax.specific.model_free.model_statistics()'.  I hope this
> > helps.
> >
> > Regards,
> >
> > Edward
> >
> >
> > On 6/25/07, Douglas Kojetin <[EMAIL PROTECTED]> wrote:
> >> Hi Edward,
> >>
> >> Once I figure out how to print the AIC and k values, I will send them
> >> along.  If you have a script example of this, it will save me some
> >> time [I've been working on this for an hour or so now without any
> >> luck].
> >>
> >> Doug
> >>
> >>
> >> On Jun 25, 2007, at 9:01 AM, Edward d'Auvergne wrote:
> >>
> >> > Hi,
> >> >
> >> > Would you be able to print the AIC and k values as well?  k is the
> >> > number of parameters in the model.  The places where the
> >> > chi-squared value increases rather than decreases are due to a
> >> > collapse in model complexity.  If you plot the chi2, AIC, and k
> >> > values versus
> >> > iteration number, like I did in my thesis in figures 7.3 and 7.4
> >> > (http://eprints.infodiv.unimelb.edu.au/archive/00002799/), you'll see
> >> > what is happening there.  The plots should help in figuring out
> >> > exactly what is happening.
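> >> >
> >> > (A throwaway sketch for that plot, assuming a PARAMS.dat-style file
> >> > with one whitespace-separated "name k n chi2 aic" line per round,
> >> > and matplotlib rather than Grace:)
> >> >
> >> > import matplotlib.pyplot as plt
> >> >
> >> > rounds, ks, chi2s, aics = [], [], [], []
> >> > for i, line in enumerate(open('PARAMS.dat')):
> >> >     fields = line.split()
> >> >     rounds.append(i + 1)
> >> >     ks.append(int(fields[1]))
> >> >     chi2s.append(float(fields[3]))
> >> >     aics.append(float(fields[4]))
> >> >
> >> > # One figure per quantity, plotted against the iteration number.
> >> > for label, values in [('k', ks), ('chi2', chi2s), ('AIC', aics)]:
> >> >     plt.figure()
> >> >     plt.plot(rounds, values, 'o-')
> >> >     plt.xlabel('iteration')
> >> >     plt.ylabel(label)
> >> >     plt.savefig(label + '_vs_iteration.png')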
> >> >
> >> > Regards,
> >> >
> >> > Edward
> >> >
>

_______________________________________________
relax (http://nmr-relax.com)

This is the relax-users mailing list
[email protected]

To unsubscribe from this list, get a password
reminder, or change your subscription options,
visit the list information page at
https://mail.gna.org/listinfo/relax-users
