Thank you for the answer. It really helps!

On Sat, Dec 15, 2012 at 3:15 AM, Ted Dunning <[email protected]> wrote:

> The point is that the AdaptiveLogisticRegression computes average
> performance over some number of training examples and then uses that
> average performance to adapt the algorithm hyper-parameters to get the
> best average results on held-out data. You want a short window so that
> the average responds quickly, but you want a long window so that the
> average is less noisy.
>
> On Fri, Dec 14, 2012 at 11:03 AM, Yang Zhou <[email protected]> wrote:
>
> > The parameter of setInterval(int) is interval, and windowSize is the
> > parameter of setAveragingWindow(int). Sorry about the mistake.
> >
> > On Sat, Dec 15, 2012 at 2:52 AM, Yang Zhou <[email protected]> wrote:
> >
> > > I read the source code and know that windowSize for setInterval(int
> > > windowSize) is the number of training examples to use in optimization.
> > > But I still cannot understand the exact meaning of windowSize for
> > > setAveragingWindow(int windowSize). Would you mind explaining more
> > > about that? Thanks!
> > >
> > > On Sat, Dec 15, 2012 at 2:44 AM, Ted Dunning <[email protected]> wrote:
> > >
> > >> I would recommend testing with OnlineLogisticRegression first.
> > >>
> > >> The AdaptiveLogisticRegression has a tendency to freeze on
> > >> sub-optimal parameter values sooner than it should.
> > >>
> > >> In any case, the averaging window for ALR should be set fairly long
> > >> and should be at least 10% of your data set. If your dataset is
> > >> small, I would recommend using OLR instead.
> > >>
> > >> On Fri, Dec 14, 2012 at 10:41 AM, Yang Zhou <[email protected]> wrote:
> > >>
> > >> > Hi,
> > >> >
> > >> > I am trying to train an AdaptiveLogisticRegression, but have no
> > >> > idea how large windowSize should be when calling setInterval(int
> > >> > windowSize) and setAveragingWindow(int windowSize) of
> > >> > AdaptiveLogisticRegression. Any suggestions? Thanks!
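To make the advice above concrete, here is a minimal sketch of setting the two windows on Mahout's AdaptiveLogisticRegression. The choice of L1 prior and the numCategories, numFeatures, datasetSize, and interval values are hypothetical placeholders; the one rule taken from the thread is sizing the averaging window to at least 10% of the training data.

    import org.apache.mahout.classifier.sgd.AdaptiveLogisticRegression;
    import org.apache.mahout.classifier.sgd.L1;
    import org.apache.mahout.math.RandomAccessSparseVector;
    import org.apache.mahout.math.Vector;

    public class AlrWindowSketch {
      public static void main(String[] args) {
        int numCategories = 2;     // hypothetical: binary target
        int numFeatures = 1000;    // hypothetical feature-vector size
        int datasetSize = 50000;   // hypothetical number of training examples

        AdaptiveLogisticRegression alr =
            new AdaptiveLogisticRegression(numCategories, numFeatures, new L1());

        // Per the thread: keep the averaging window fairly long, at least
        // 10% of the data set, so the held-out performance estimate that
        // drives hyper-parameter adaptation is not too noisy.
        alr.setAveragingWindow(datasetSize / 10);

        // setInterval controls how many training examples are seen between
        // hyper-parameter adaptation steps; 1000 is an arbitrary placeholder.
        alr.setInterval(1000);

        for (int i = 0; i < datasetSize; i++) {
          Vector features = new RandomAccessSparseVector(numFeatures);
          // ... fill in the feature vector for example i here ...
          int actual = 0; // the true category of example i
          alr.train(actual, features);
        }
        alr.close();
      }
    }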

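And, following the suggestion to prefer OnlineLogisticRegression on small data sets, a comparable sketch with hand-set hyper-parameters. The learningRate and lambda values here are hypothetical starting points to tune from, not recommendations from the thread; with OLR there is no adaptation, so these knobs replace the averaging window entirely.

    import org.apache.mahout.classifier.sgd.L2;
    import org.apache.mahout.classifier.sgd.OnlineLogisticRegression;
    import org.apache.mahout.math.RandomAccessSparseVector;
    import org.apache.mahout.math.Vector;

    public class OlrSketch {
      public static void main(String[] args) {
        int numCategories = 2;   // hypothetical: binary target
        int numFeatures = 1000;  // hypothetical feature-vector size

        // Hyper-parameters set by hand rather than adapted; hypothetical values.
        OnlineLogisticRegression olr =
            new OnlineLogisticRegression(numCategories, numFeatures, new L2(1))
                .learningRate(0.1)
                .lambda(1e-4);

        Vector example = new RandomAccessSparseVector(numFeatures);
        // ... fill in features, then train once per (label, vector) pair ...
        olr.train(1, example);

        // For the binary case, classifyScalar returns the estimated
        // probability that the example belongs to category 1.
        double p = olr.classifyScalar(example);
        System.out.println("p(category 1) = " + p);
      }
    }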