SGD is a stochastic algorithm: it visits the training examples in a random order on each run, so even with identical training and test data the learned model, and therefore the confusion matrix, can differ from run to run. The more data you have, the closer each run will be. How much data do you have?

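A quick way to see this effect (a rough sketch only; it uses scikit-learn's SGDClassifier as a stand-in for whichever SGD implementation you are using, and fixing two different random seeds just mimics two unseeded runs):

    # Illustrative sketch: two SGD runs on the same data give different
    # confusion matrices. SGDClassifier here is only a stand-in.
    from sklearn.datasets import make_classification
    from sklearn.linear_model import SGDClassifier
    from sklearn.metrics import confusion_matrix, accuracy_score

    # Small synthetic dataset; with few examples, run-to-run variance is large.
    X, y = make_classification(n_samples=200, n_features=20, random_state=0)

    for seed in (1, 2):
        # Each run shuffles the training examples differently, so the learned
        # weights -- and the resulting confusion matrix -- differ.
        clf = SGDClassifier(max_iter=1000, random_state=seed).fit(X, y)
        pred = clf.predict(X)          # test on the same data, as in your setup
        print("run with seed", seed)
        print(confusion_matrix(y, pred))
        print("accuracy:", accuracy_score(y, pred))

With a larger n_samples the two runs' matrices move closer together, which is the point above.
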
On Thu, Aug 30, 2012 at 2:49 PM, Salman Mahmood <[email protected]> wrote:
> I have noticed that every time I train and test a model using the same data
> (with the SGD algorithm), I get a different confusion matrix. That is, if I
> generate a model and look at the confusion matrix, it might say 90% correctly
> classified instances, but if I generate the model again (with the SAME data
> for training and testing as before) and test it, the confusion matrix changes
> and might say 75% correctly classified instances.
>
> Is this a desired behavior?



-- 
Lance Norskog
[email protected]
