AFAIK, for question 2 there is no built-in method that handles this problem.
Right now, we can only apply one type of regularization at a time.
However, an elastic net implementation is underway.
You can follow this ticket for further discussion:
https://issues.apache.org/jira/browse/SPARK-1543
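
In the meantime, a workaround on the Python side is straightforward. As a minimal sketch (not Spark API, just plain Python), the usual trick from the linked LingPipe post is to rewrite log(1 + exp(margin)) so the exponent is never of a large positive number:

```python
import math

def log1p_exp(margin):
    """Numerically stable log(1 + exp(margin)).

    Naively, math.exp(margin) overflows once margin exceeds ~709.
    For positive margins we instead use the identity
        log(1 + exp(m)) = m + log(1 + exp(-m)),
    so the exponent passed to exp() is always <= 0.
    """
    if margin > 0:
        return margin + math.log1p(math.exp(-margin))
    return math.log1p(math.exp(margin))
```

For very large positive margins, exp(-margin) underflows harmlessly to 0, so the function just returns the margin itself instead of raising an OverflowError.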


2014-07-17 2:08 GMT+08:00 fjeg <francisco.gime...@gmail.com>:

> 1) Okay, to clarify, there is *no* way to regularize logistic regression in
> python (sorry if I'm repeating your answer).
>
> 2) This method you described will have overflow errors when abs(margin) >
> 750. Is there a built-in method to account for this? Otherwise, I will
> probably have to implement something like this:
>
> http://lingpipe-blog.com/2012/02/16/howprevent-overflow-underflow-logistic-regression
>
> Also, another question about the Scala implementation:
> Can we only do one type of regularization? Is there any way to perform
> elastic net which is a combination of L1 and L2?
>
>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/MLLib-Regularized-logistic-regression-in-python-tp9780p9963.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>