hi,

> I'd like to ask why `BayesianRidge` and `ARDRegression` do not use
> the marginal log likelihood (MLL) but rather the learned coefficients
> to check convergence when fitting.
> I know that most iterative algorithms must have some objective
> function by which convergence is checked.
> In Bayesian inference, e.g. variational learning, the objective
> function is the MLL.
> Is there any reason not to use the MLL?

To be honest, I would say no. It's mostly done this way for historical
reasons: this code has not changed much since our first sprints.
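
Roughly, the stopping rule is of this form (a minimal sketch, not the
actual scikit-learn code; `update_step` is a stand-in for one pass of
the re-estimation equations):

    import numpy as np

    def fit_until_coef_converges(update_step, w_init, tol=1e-3, max_iter=300):
        # Iterate until the coefficients stop moving; the MLL itself
        # is never consulted by the stopping rule.
        w = w_init
        for _ in range(max_iter):
            w_new = update_step(w)
            if np.sum(np.abs(w_new - w)) < tol:
                return w_new
            w = w_new
        return w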

> Also, is the learning algorithm of `BayesianRidge` and
> `ARDRegression` a kind of variational learning?
> If so, the MLL is guaranteed to increase during learning.
> However, the MLL of ARDRegression
> (http://scikit-learn.org/stable/_images/plot_ard_3.png) does decrease.
> Is this a bug or did I misunderstand something?

I am not sure, but I remember skipping a test due to a similar problem;
see test_bayes.py. This code needs some love, and it would be great if
you dug a bit into it. The notation and algorithms should follow
Bishop's book.
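
If you want to reproduce the issue, something along these lines should
work (this assumes the compute_score=True option, which, if I recall
correctly, stores the log marginal likelihood at each iteration in the
scores_ attribute):

    import numpy as np
    from sklearn.linear_model import ARDRegression

    rng = np.random.RandomState(0)
    X = rng.randn(50, 10)
    y = np.dot(X, rng.randn(10)) + 0.1 * rng.randn(50)

    # scores_ holds the marginal log likelihood after each iteration;
    # for a proper variational/EM scheme it should be non-decreasing.
    clf = ARDRegression(compute_score=True).fit(X, y)
    print("MLL non-decreasing:", np.all(np.diff(clf.scores_) >= 0))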

Alex
