No.  By linear SVM, I mean an SVM that does not use the kernel trick.

This is like logistic regression SGD with a different gradient function.
Same idea otherwise.  Yes.  This is a convex problem.
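
To make that concrete, here is a rough, hypothetical sketch in plain Java
(not Mahout code; all class and method names are illustrative) of one SGD
step for a linear SVM with hinge loss next to the corresponding logistic
regression step.  Only the gradient term changes:

// Rough sketch, not Mahout code: one SGD step for a linear SVM (hinge loss)
// next to one SGD step for logistic regression.  Only the gradient differs.
import java.util.Random;

public class LinearSvmSgdSketch {

  // Hinge loss with L2 regularization: lambda/2 * ||w||^2 + max(0, 1 - y * w.x)
  static void svmStep(double[] w, double[] x, double y, double eta, double lambda) {
    double margin = y * dot(w, x);
    for (int j = 0; j < w.length; j++) {
      double grad = lambda * w[j];           // gradient of the regularizer
      if (margin < 1) {
        grad -= y * x[j];                    // hinge subgradient is active
      }
      w[j] -= eta * grad;
    }
  }

  // Logistic loss with L2 regularization: same update, different gradient term.
  static void logisticStep(double[] w, double[] x, double y, double eta, double lambda) {
    double p = 1.0 / (1.0 + Math.exp(-y * dot(w, x)));
    for (int j = 0; j < w.length; j++) {
      w[j] -= eta * (lambda * w[j] - (1.0 - p) * y * x[j]);
    }
  }

  static double dot(double[] a, double[] b) {
    double s = 0.0;
    for (int j = 0; j < a.length; j++) {
      s += a[j] * b[j];
    }
    return s;
  }

  public static void main(String[] args) {
    // Tiny synthetic check: labels in {-1, +1} determined by the first feature.
    Random rnd = new Random(42);
    double[] w = new double[2];
    for (int t = 1; t <= 10000; t++) {
      double x0 = rnd.nextGaussian();
      double[] x = {x0, 1.0};                // one feature plus a bias term
      double y = x0 > 0 ? 1.0 : -1.0;
      svmStep(w, x, y, 0.1 / Math.sqrt(t), 1e-3);
    }
    System.out.println("learned weights: " + w[0] + ", " + w[1]);
  }
}

Because the regularized hinge loss is convex, there are no spurious local
optima to worry about.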

On Wed, Feb 29, 2012 at 10:50 PM, Aditya Sarawgi
<[email protected]> wrote:

> So if I understand correctly, I think you mean that instead of having
> multiple layers of SVM, I just have one layer that computes the SVM of the
> individual datasets, and in the reducer I take the optimum over all of
> them. But is it guaranteed to give a global optimum?
>
> On Thu, Mar 1, 2012 at 1:37 AM, Ted Dunning <[email protected]> wrote:
>
>> For linear SVM, gradient descent is a fine algorithm.  If you go into this
>> work, I would recommend that you implement an all-reduce operation since
>> iterated map-reduce is very inefficient.
>>
>> On Wed, Feb 29, 2012 at 10:30 PM, Aditya Sarawgi
>> <[email protected]> wrote:
>>
>> > Hi,
>> >
>> > Thanks Todd for the pointer. I actually had one more paper in mind, and
>> > it's from the original author of SVM:
>> > http://leon.bottou.org/publications/pdf/nips-2004c.pdf
>> >
>> > I think this makes more sense for mapreduce. I am open to other
>> > suggestions or algorithms.
>> >
>> > Thanks
>> > Aditya Sarawgi
>> >
>> > On Thu, Mar 1, 2012 at 1:04 AM, Todd Johnson <[email protected]> wrote:
>> >
>> > > The authors of that paper don't believe their algorithm is a good
>> > > candidate for mapreduce. See:
>> > >
>> > > http://groups.google.com/group/psvm/browse_thread/thread/cedd3a6caef0f9c9#
>> > >
>> > > todd.
>> > >
>> > >
>> > >
>> > > On Wed, Feb 29, 2012 at 9:31 PM, Aditya Sarawgi
>> > > <[email protected]> wrote:
>> > >
>> > > > Hello,
>> > > >
>> > > > I am looking to implement psvm for Mahout as a part of my coursework.
>> > > > The reference paper is
>> > > > http://books.nips.cc/papers/files/nips20/NIPS2007_0435.pdf
>> > > > and there is an implementation at http://code.google.com/p/psvm/
>> > > > which uses MPI.
>> > > > Any ideas or pointers are much appreciated.
>> > > >
>> > > > Thanks
>> > > > Aditya Sarawgi
>> > > >
>> > >
>> >
>> >
>> >
>> > --
>> > Cheers,
>> > Aditya Sarawgi
>> >
>>
>
>
>
> --
> Cheers,
> Aditya Sarawgi
>
