Thanks for the paper you sent. It helps.

On Mon, Jul 1, 2013 at 3:00 PM, Ted Dunning <[email protected]> wrote:

> I believe that there is a link to this paper in the code.  If not, file a
> JIRA.
>
>
> http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.177.3514&rep=rep1&type=pdf
>
> Note also that stochastic gradient descent is a very common algorithm for
> large-scale logistic regression.  You can find the basics anywhere with a
> simple Google search.
>
>
> Sent from my iPhone
>
>
> On Jul 1, 2013, at 11:59, qiaoresearcher <[email protected]> wrote:
>
> > Ted,
> >
> > Thanks, but I have looked into the code and found it not very clear to
> > me.
> > For a given cost function, there are many ways to optimize it,
> > e.g., gradient descent or Newton's method. It would be much better if
> > the authors of the Mahout code could simply put one line somewhere
> > saying: the implementation is based on XXX paper, XXX book chapter, or
> > XXX webpage. Otherwise it is not easy for users to figure out what
> > update rule is used in the code...
> >
> > Regards,
> >
> > On Mon, Jul 1, 2013 at 3:22 AM, Ted Dunning <[email protected]> wrote:
> >
> >> Follow into the regression code itself and check the references.
> >>
> >>
> >> On Fri, Jun 28, 2013 at 3:35 PM, qiaoresearcher <[email protected]> wrote:
> >>
> >>> The logistic regression code is difficult to follow: the trainlogistic
> >>> and runlogistic parts; how the likelihood is calculated, how the
> >>> weights are updated, etc.
> >>>
> >>> Does anyone know who wrote the Mahout logistic regression code? What
> >>> are the references on the logistic regression algorithm he was using
> >>> to write the code?
> >>>
> >>> thanks,
> >>
>
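For readers following this thread: the stochastic gradient descent update Ted mentions can be sketched roughly as below. This is a generic Python illustration of SGD for logistic regression, not Mahout's actual OnlineLogisticRegression implementation (Mahout adds adaptive learning-rate annealing and lazy regularization on top of this basic rule); the toy data and function names here are made up for demonstration.

```python
# Minimal sketch of stochastic gradient descent for logistic regression.
# An illustration of the general technique, NOT Mahout's actual code.
import math
import random

def sigmoid(z):
    # Numerically stable logistic function.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    e = math.exp(z)
    return e / (1.0 + e)

def sgd_train(data, num_features, learning_rate=0.1, epochs=50, seed=0):
    """data: list of (feature_vector, label) pairs, label in {0, 1}."""
    rng = random.Random(seed)
    w = [0.0] * num_features
    for _ in range(epochs):
        rng.shuffle(data)  # visit examples in random order (mutates data)
        for x, y in data:
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
            # Per-example gradient ascent on the log-likelihood:
            #   w <- w + eta * (y - p) * x
            for i in range(num_features):
                w[i] += learning_rate * (y - p) * x[i]
    return w

# Toy usage: learn a threshold on one feature (index 0 is a bias term).
examples = [([1.0, x / 10.0], 1 if x > 5 else 0) for x in range(11)]
weights = sgd_train(examples, num_features=2)
predict = lambda x: sigmoid(sum(wi * xi for wi, xi in zip(weights, x)))
```

After training, `predict([1.0, 0.9])` is above 0.5 and `predict([1.0, 0.1])` is below it, i.e. the learned boundary separates the toy classes.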
