I'd like to ask why `BayesianRidge` and `ARDRegression` check convergence on the learned coefficients rather than on the marginal log likelihood (MLL) when fitting. I know that most iterative algorithms check convergence against some objective function, and in Bayesian inference, e.g. variational learning, that objective is the MLL. Is there any reason not to use the MLL?
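For context, here is a minimal sketch of what I mean: with `compute_score=True`, the per-iteration log marginal likelihood is stored in the `scores_` attribute, so one can check after fitting whether it increased monotonically, even though the stopping rule itself is based on the change in the coefficients (the data here is synthetic, just for illustration):

```python
# Sketch: inspect the marginal log likelihood (MLL) trace of BayesianRidge.
# With compute_score=True, the log marginal likelihood at each iteration is
# stored in the scores_ attribute after fitting.
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.RandomState(0)
X = rng.randn(50, 5)
w_true = np.array([1.0, 2.0, 0.0, 0.0, -1.0])  # sparse-ish true weights
y = X @ w_true + 0.1 * rng.randn(50)

model = BayesianRidge(compute_score=True).fit(X, y)
mll = np.asarray(model.scores_)  # one MLL value per iteration

print("MLL trace:", mll)
print("monotone increase:", bool(np.all(np.diff(mll) >= 0)))
```

The same check works for `ARDRegression`, which also accepts `compute_score=True` and exposes `scores_`.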
Also, is the learning algorithm of `BayesianRidge` and `ARDRegression` a kind of variational learning? If so, the MLL is guaranteed to increase during learning. However, the MLL of `ARDRegression` (http://scikit-learn.org/stable/_images/plot_ard_3.png) did decrease. Is this a bug, or did I misunderstand something?

---------------------------------------------------------------
杜 世橋 (Shiqiao Du)
E-mail [email protected]
Twitter http://twitter.com/lucidfrontier45

_______________________________________________
Scikit-learn-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
