When I create a database and insert 1.5 years of 5-minute bar data, the 
optimization I run takes around 2.5 hours.

When I add more data to the database to end up with 8 years of 5-minute bar 
data, the same optimization takes over 4 days, far worse than the roughly 5x 
increase in data would suggest.

Even when I restrict the date range to a 1.5-year period in the larger 
database, a single backtest is noticeably slower than the same backtest run 
against a database containing only those 1.5 years.

Is there anything that I can do to improve the performance of the optimization 
over the larger period?
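One common cause of this pattern, if the bars happen to live in a SQL-style store (an assumption — the post doesn't name the storage engine), is that a date-restricted query still scans the entire table when there is no index covering the timestamp column, so query time grows with total database size rather than with the requested range. A minimal sqlite3 sketch with a hypothetical `bars` table:

```python
import sqlite3

# In-memory database standing in for the bar store (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bars (symbol TEXT, ts INTEGER, close REAL)")
conn.executemany(
    "INSERT INTO bars VALUES ('ES', ?, 0.0)",
    [(t,) for t in range(100000)],
)

query = "SELECT * FROM bars WHERE symbol = 'ES' AND ts BETWEEN 10000 AND 20000"

# Without an index, the date-restricted query scans every row, which is
# why a 1.5-year backtest slows down as the database grows to 8 years.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]

# An index on (symbol, ts) lets the same query touch only matching rows.
conn.execute("CREATE INDEX idx_bars_symbol_ts ON bars(symbol, ts)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]

print(plan_before)  # a full-table SCAN
print(plan_after)   # a SEARCH using the index
```

If the platform exposes its schema, checking whether date-range queries use an index (or adding one) may fix the date-restricted case without splitting the database at all.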

My plan now is to create a database for each 2-year period and run the 
optimizations on each individually.
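That per-period workaround amounts to partitioning the bars by date. If it helps to automate the split, here is a sketch of mapping each bar's timestamp to the 2-year database it belongs in (the naming scheme and the 2000 epoch year are hypothetical, since the post doesn't say which platform or file layout is in use):

```python
from datetime import datetime, timezone

def partition_key(ts: float, years_per_db: int = 2, epoch_year: int = 2000) -> str:
    """Map a Unix timestamp to the name of the 2-year database it belongs in."""
    year = datetime.fromtimestamp(ts, tz=timezone.utc).year
    # Snap the bar's year down to the start of its 2-year bucket.
    start = epoch_year + ((year - epoch_year) // years_per_db) * years_per_db
    return f"bars_{start}_{start + years_per_db - 1}"

# Example: a bar from mid-2003 lands in the 2002-2003 database.
key = partition_key(datetime(2003, 6, 1, tzinfo=timezone.utc).timestamp())
print(key)  # bars_2002_2003
```

Routing each incoming bar through a function like this keeps every database small enough that individual optimizations stay near the 2.5-hour mark, at the cost of not being able to optimize across a bucket boundary in one run.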

Any other ideas?  Any tricks for getting more performance on a large data set?
