I set up a 5-second database today and let the number of bars default to 1000, duh. I am trying to build a long-term database for "forward" testing using BarReplay. When I tried to use the data there wasn't much there, duh again. I changed the database setting to 200,000 bars, about 84 days, but when I hit OK it warned me there will be performance degradation. I didn't think AB loaded all the bars, only enough to fill the indicators. Why are we getting this message, what does it really mean, and why does the database default to so small a number of bars?
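As a quick sanity check on how far a "number of bars" setting stretches at 5-second resolution, here is a back-of-envelope sketch. The session lengths are my own illustrative assumptions (the post does not say which market or hours are involved), and the days-covered figure scales directly with them:

```python
# Back-of-envelope: how many trading days a per-symbol
# "number of bars" setting covers at 5-second resolution.
# Session lengths are assumptions for illustration only.

BAR_SECONDS = 5

def bars_per_day(session_hours):
    """5-second bars in one trading day of the given length."""
    return int(session_hours * 3600 / BAR_SECONDS)

def days_covered(total_bars, session_hours):
    """Trading days the bar limit can hold for one symbol."""
    return total_bars / bars_per_day(session_hours)

print(bars_per_day(6.5))           # 4680 bars in an assumed 6.5-hour session
print(days_covered(1000, 6.5))     # the 1000-bar default: well under one day
print(days_covered(200_000, 6.5))  # roughly 42-43 days at 6.5 hours/day
```

With a 6.5-hour session the 1000-bar default holds only a fraction of a single day, which matches the "not much there" experience; shorter sessions stretch 200,000 bars closer to the 84 days mentioned above.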
I had wondered why I kept losing data. It seems AB truncates the data in the database to the number of bars specified in the database settings. Is that the maximum size of the database? I always thought, especially given the message about performance, that this was the number of bars that would be loaded. I did not know it was the maximum database size and that AB would truncate the data when that number was exceeded.

Before it does that, there should be a warning that the data will be truncated unless the database size is changed. Data is important and should not simply be deleted. It would be much clearer if the parameter were labeled "database size per symbol"; that would give us a clue as to what is happening when we start losing data. And maybe the message text should be changed if the size of the database does not actually impact performance.

What needs to be done so I can archive lots of data without impacting performance? I surely don't want AB to load 100,000 or a million bars when I open a symbol. With terabyte hard drives, archiving lots of data is pretty cheap these days. Back in the early days of AB, 40 GB drives were standard. No longer.

Barry
