There are a few ideas you may want to try with this.

1) See how effective a least squares fit would be at predicting this, with various sample sizes.
2) For filtering out the noise, consider taking the FFT of the data, dropping the smallest (least significant) coefficients, and then inverse-transforming. Then predict the next value relative to the last known value: i.e. take the offset vector from the last FFT-processed point to the next predicted point, and apply that vector to the last known raw point. See if it helps.

3) You could also use the FFT directly as a regression prediction method. Extend the projection to t+1 and see what all the frequencies add up to at t+1. If the data is supposed to be periodic (and if it isn't, the AI has a hopeless task anyway), this should yield reasonably good results. The same filtering as in 2) can be applied, i.e. dropping the least significant coefficients.

4) To enhance the representation of the periodic behaviour of the data, consider taking the least squares fit through the data and adjusting the data so that the least squares line lies on the X axis. This will make the data easier to model, especially when the FFT based models are used.

Note that this is seriously getting into the realm of using a nuke to crack a nut. We could make this extremely sophisticated, and very CPU expensive, but if the time taken to work out the routing is similar to (or greater than) the time it would take to "just try it", it is ultimately pointless. The method has to be computationally cheap to be effective: the time it saves has to be considerably greater than the time it costs.

Gordan

_______________________________________________
devl mailing list
[EMAIL PROTECTED]
http://hawk.freenetproject.org:8080/cgi-bin/mailman/listinfo/devl
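P.S. As a rough illustration of idea 1), here is a minimal sketch of a straight-line least squares predictor using numpy. The data and function name are made up for the example; it's not meant as the actual routing-time predictor:

```python
import numpy as np

def least_squares_predict(y, t_next):
    """Fit a straight line through the samples by least squares
    and evaluate it at t_next (illustrative sketch)."""
    t = np.arange(len(y))
    slope, intercept = np.polyfit(t, y, 1)  # degree-1 least squares fit
    return slope * t_next + intercept

# synthetic data: a linear trend (2t + 3) plus noise
rng = np.random.default_rng(0)
y = 2.0 * np.arange(50) + 3.0 + rng.normal(0.0, 0.5, 50)

pred = least_squares_predict(y, 50)  # predict the next (51st) sample
```

Trying this with different sample sizes is just a matter of slicing y to a shorter window before fitting.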
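Ideas 2)-4) can be combined into a single sketch: detrend with the least squares line, zero out the smallest FFT coefficients, and evaluate the retained frequencies at the next time step. Again this is only an illustration assuming numpy; the test signal and the `keep` parameter are invented for the example:

```python
import numpy as np

def fft_extrapolate(y, keep=5):
    """Predict y at t = len(y):
    detrend via least squares (idea 4), keep only the `keep`
    largest-magnitude FFT coefficients (idea 2), then evaluate the
    retained frequencies at the next time step (idea 3)."""
    n = len(y)
    t = np.arange(n)
    # 4) move the least squares line onto the X axis
    slope, intercept = np.polyfit(t, y, 1)
    resid = y - (slope * t + intercept)
    # 2) FFT, then zero the least significant coefficients
    coeffs = np.fft.fft(resid)
    order = np.argsort(np.abs(coeffs))   # smallest magnitudes first
    coeffs[order[:-keep]] = 0.0
    # (for real input, coefficients come in conjugate pairs k and n-k,
    #  so `keep` should be large enough to cover whole pairs)
    # 3) evaluate the retained frequencies at t_next = n, re-add the trend
    t_next = n
    k = np.arange(n)
    periodic = np.sum(coeffs * np.exp(2j * np.pi * k * t_next / n)).real / n
    return periodic + slope * t_next + intercept

# synthetic periodic signal riding on a linear trend
t = np.arange(64)
y = 10.0 + 0.5 * t + 3.0 * np.sin(2 * np.pi * 4 * t / 64)

# the true continuation of this pattern at t=64 would be 42.0
pred = fft_extrapolate(y)
```

Note the CPU-cost caveat above applies: an FFT plus a fit per prediction is already far more work per node than "just try it" unless the predictions are cached.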
