It looks like the initial intercept term is 1 only in the addIntercept
&& numOfLinearPredictor == 1 case. It does seem inconsistent; since
it's just an initial weight it may not matter to the final converged
value. You can see a few notes in the class about how
numOfLinearPredictor == 1 is handled a bit inconsistently and how a
smarter choice of initial intercept could help convergence. So I'm not
sure this rises to the level of a bug, but I don't think the difference
is intentional either.
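
For anyone following along, here is a minimal standalone sketch (plain
Scala, no Spark dependency) of the behaviour being described. The
appendBias below is a hypothetical stand-in for the helper that appends
the constant-1 bias term, not the actual MLlib code:

    object InterceptInitSketch {
      // Stand-in for the bias-appending helper: append 1.0 to a vector.
      def appendBias(v: Array[Double]): Array[Double] = v :+ 1.0

      def main(args: Array[String]): Unit = {
        val numFeatures = 3
        val initialWeights = Array.fill(numFeatures)(0.0)  // coefficients start at 0.0

        // In the addIntercept && numOfLinearPredictor == 1 path the initial
        // weights go through the same bias-appending step as the data, so
        // the intercept slot starts at 1.0 rather than 0.0.
        val initialWeightsWithIntercept = appendBias(initialWeights)

        println(initialWeightsWithIntercept.mkString(", "))  // 0.0, 0.0, 0.0, 1.0
      }
    }

Running it prints "0.0, 0.0, 0.0, 1.0": the appended bias slot, which
becomes the intercept's starting value, is 1.0 while the other
coefficients start at 0.0.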

On Thu, Feb 5, 2015 at 5:40 PM, jamborta <jambo...@gmail.com> wrote:
> hi all,
>
> I have been going through GeneralizedLinearAlgorithm to understand how
> intercepts are handled in regression. I noticed that the intercept is
> initialized to one (whereas the rest of the coefficients are initialized to
> zero) by the same piece of code that prepends the 1 to each data point.
> Is this a bug?
>
> thanks,
>
>
>
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/one-is-the-default-value-for-intercepts-in-GeneralizedLinearAlgorithm-tp21525.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
