2012/6/22 Kai Kuehne <[email protected]>:
> I didn't explain the thing I don't understand clearly.
> Let me try again...
>
> In this first picture:
> http://jakevdp.github.com/_images/plot_bias_variance_examples_3.png
>
> Both the training and cross-validation errors start high, so there is
> high bias when the degree is small.
>
> On the second picture:
> http://jakevdp.github.com/_images/plot_bias_variance_examples_4.png
>
> The left panel shows d = 1, so a low degree.
> The cross-validation error starts high, but ... and that's the thing
> I both don't understand and cannot reproduce: the training error
> starts small. The first diagram shows both starting high for small
> degrees...
>

I don't know for sure, but I think those curves should be recomputed to
display the mean across 10 runs of 10-fold CV, along with the
standard error of the mean as error bars, as I almost did on my
graphs (I used the standard deviation instead).
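A minimal sketch of that computation with scikit-learn. The dataset and estimator here are placeholders (not the tutorial's polynomial example), and the fold reshuffling per run is one way to get independent-ish runs:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold, cross_val_score

# Placeholder data and model; the tutorial uses a polynomial fit instead.
X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=0)

n_runs = 10
run_means = []
for seed in range(n_runs):
    # Reshuffle the folds on every run so the 10 runs differ.
    cv = KFold(n_splits=10, shuffle=True, random_state=seed)
    scores = cross_val_score(Ridge(), X, y, cv=cv,
                             scoring="neg_mean_squared_error")
    run_means.append(scores.mean())

run_means = np.asarray(run_means)
mean = run_means.mean()
# Standard error of the mean across the 10 runs -> error bars.
sem = run_means.std(ddof=1) / np.sqrt(n_runs)
print(f"CV error (MSE): {-mean:.2f} +/- {sem:.2f}")
```

Plotting `-mean` with `sem` as error bars for each degree would give the smoothed curves suggested above; the standard deviation (`run_means.std(ddof=1)`) would give the wider bars I used on my graphs.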

-- 
Olivier
http://twitter.com/ogrisel - http://github.com/ogrisel

_______________________________________________
Scikit-learn-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
