Take for example two frogs who are trying to catch a fly.  Both frogs notice
that the fly lands on a lilypad once an hour, so both sit on the lilypad
and wait.  One decides to chase the fly, but the fly moves much too fast for
him.  Eventually the fly makes it back to the lilypad, where the patient
frog has been waiting all along.

Let us not forget that curve-fitting to the past does not guarantee results
in the future!  Re-optimizing over short periods of time makes your results
even less statistically significant.



On Sat, Dec 4, 2010 at 2:22 PM, Astor <[email protected]> wrote:

> Parameters may mean revert very quickly, very slowly or not at all. In case
> of the latter two, you can have some serious losses before realizing that
> the old parameters are no longer valid and re-optimization is needed.
> Limited re-optimization would allow the parameters to drift away from the
> old "global" values if the change is persistent but would not cause any harm
> if they mean revert. So it is a form of parameter monitoring and insurance.
>
> By limited re-optimization I mean searching not the entire range of the
> parameter values, but only the values within 5% of the most recent value.
> This way, on any single re-optimization the parameters will not change more
> than five percent from the prior value, which should not hurt even if the
> parameters quickly mean revert. On the other hand, if there is a sustained
> change in parameter values, after 10 or more re-optimizations the new
> parameters will be far closer to reality than the old global ones.
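
A rough bound on how far this drift can go: if every re-optimization moves a
parameter by at most 5% of its prior value, ten rounds compound to about
1.05^10 ≈ 1.63, i.e. roughly a 63% cumulative shift away from the original
"global" value. A one-line check (the round count is illustrative):

```java
public class DriftBound {
    public static void main(String[] args) {
        // Maximum cumulative drift after n re-optimizations, each of which
        // can move a parameter by at most 5% of its prior value.
        int n = 10;
        double bound = Math.pow(1.05, n);
        System.out.printf("Max drift after %d rounds: x%.2f%n", n, bound); // x1.63
    }
}
```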
>
> Perhaps the class could use a call something like:
>
>
>
> addParam(FAST_PERIOD, .95*Old_FastValue, 1.05*Old_FastValue,
>     .01*Old_FastValue, Old_FastValue)
>
> addParam(SLOW_PERIOD, .95*Old_SlowValue, 1.05*Old_SlowValue,
>     .01*Old_SlowValue, Old_SlowValue)
>
> This would only require evaluating 11 values for each parameter (from
> 0.95x to 1.05x in 0.01x steps), or 121 combinations for both, so it could
> run very fast and remain within 5% of the prior value.
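
A minimal standalone sketch of this windowed search, assuming the
hypothetical addParam(name, min, max, step, initial) signature above
(parameter names and prior values are made up for illustration):

```java
// Sketch of limited re-optimization: the search window for each parameter
// is clamped to +/-5% of its prior value, stepped in 1% increments.
public class LimitedReopt {
    // Returns the {min, max, step} triple for the clamped search window.
    static double[] window(double oldValue) {
        return new double[] { 0.95 * oldValue, 1.05 * oldValue, 0.01 * oldValue };
    }

    // Stand-in for the strategy's parameter registration; just prints the grid.
    static void addParam(String name, double min, double max, double step, double init) {
        System.out.printf("%s: [%.3f .. %.3f] step %.3f, start %.3f%n",
                          name, min, max, step, init);
    }

    public static void main(String[] args) {
        double oldFast = 12.0, oldSlow = 26.0; // hypothetical prior optima
        double[] f = window(oldFast);
        addParam("FAST_PERIOD", f[0], f[1], f[2], oldFast);
        double[] s = window(oldSlow);
        addParam("SLOW_PERIOD", s[0], s[1], s[2], oldSlow);
    }
}
```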
>
>  ------------------------------
> *From:* Eugene Kononov <[email protected]>
> *To:* [email protected]
> *Sent:* Sat, December 4, 2010 11:51:57 AM
>
> *Subject:* Re: [JBookTrader] Dynamic Parameter Optimization
>
> On Sat, Dec 4, 2010 at 11:22 AM, Astor <[email protected]> wrote:
>
>>   Yes. This discussion thread is exactly what I am thinking.
>>
>
> Ok, yes, nothing stops us from experimenting. One thing that makes me
> reluctant to put my time and effort into coding it is what I consider a
> very weak hypothesis underlying walk-forward optimization. That
> hypothesis is that the best optimization parameters for the recent market
> period are superior to the more "global" optimization parameters when
> applied to the future period. Markets do change, indeed, but is there any
> evidence that these new patterns of behavior persist, rather than revert
> back to what they have been in the past?
>
>
> --
> You received this message because you are subscribed to the Google Groups
> "JBookTrader" group.
> To post to this group, send email to [email protected].
> To unsubscribe from this group, send email to
> [email protected].
> For more options, visit this group at
> http://groups.google.com/group/jbooktrader?hl=en.
>
>

