On Tue, Jan 17, 2012 at 2:56 PM, Shiqiao Du (杜 世橋)
<[email protected]> wrote:
> I'd like to ask why `BayesianRidge` and `ARDRegression` use the learned
> coefficients rather than the marginal log likelihood (MLL) to check for
> convergence during fitting.
> I know that most iterative algorithms have some objective
> function by which convergence is checked.
> In Bayesian inference, e.g. variational learning, the objective function is
> the MLL.
> Is there any reason not to use the MLL?

I think we could add an option to support both types of convergence
condition. Checking for parameter change is fast but has no formal
guarantee (it's a heuristic), while the MLL does offer one but is slower
to compute.
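For reference, a minimal sketch of how one could already monitor the MLL with the existing API: `BayesianRidge(compute_score=True)` records the log marginal likelihood at each iteration in the `scores_` attribute, so successive differences of `scores_` could serve as an MLL-based stopping check (this is an illustration, not the estimator's built-in convergence test):

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

# Toy regression problem with known coefficients
rng = np.random.RandomState(0)
X = rng.randn(50, 3)
y = X @ np.array([1.0, 2.0, -1.0]) + 0.1 * rng.randn(50)

# compute_score=True stores the log marginal likelihood per iteration
reg = BayesianRidge(compute_score=True).fit(X, y)

# Successive MLL differences: an MLL-based convergence criterion would
# stop when these fall below a tolerance, instead of (or in addition to)
# the current check on coefficient change
deltas = np.abs(np.diff(reg.scores_))
```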

Mathieu

_______________________________________________
Scikit-learn-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
