On 08.08.2012 15:48, "amir rahimi" <[email protected]> wrote:
>
> Thanks for the fast response.
>
> to JP: It works for me using gcc and g++ on 32-bit Mac and Linux! :)
>
> J. Friedman, in the paper "Greedy Function Approximation: A Gradient
Boosting Machine", describes the M-regression algorithm, a gradient
boosting regression method with the Huber loss function.
>
Amir,
M-regression has been added recently - please take a look at the current
master and the development docs.
Best,
Peter
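
For reference, a minimal sketch of what using this looks like, assuming the
development version exposes it through GradientBoostingRegressor's `loss`
parameter (with `loss="huber"` and `alpha` as the Huber quantile, as in the
current docs); the toy data here is purely illustrative:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.RandomState(0)
X = rng.uniform(size=(200, 3))
y = X[:, 0] + 0.05 * rng.normal(size=200)
y[::20] += 3.0  # a few outliers, which the Huber loss should down-weight

# alpha is the quantile at which the loss switches from squared to linear
est = GradientBoostingRegressor(loss="huber", alpha=0.9, n_estimators=100)
est.fit(X, y)
pred = est.predict(X[:5])
```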
> Adding these features is generally useful for users and makes the
library a unified framework for machine learning. I need a structured output
SVM for plane parameter estimation from 3D depth points, and gradient
boosting for a similar task, but I want to minimize my own loss function.
>
>
>
>
> On Wed, Aug 8, 2012 at 4:20 PM, Andreas Müller <[email protected]>
wrote:
>>
>> Hi Amir.
>> 1) As far as I know, gradient boosting currently works only with trees,
using deviance or least-squares loss.
>> I don't think it should be hard to add other losses, though.
>>
>> 2) There are at the moment no plans to add structured SVMs to the
library. The reason is that structured
>> models are usually very problem-specific. It is possible to build
generic frameworks like Joachims' SVMstruct,
>> which work by having the user specify functions for features, inference
and loss-augmented inference,
>> but this doesn't really fit well with the sklearn principle of using
only arrays as data structures and
>> having a simple "fit/predict" interface.
>>
>> What application did you have in mind?
>>
>> In general I would love to have structured learning in sklearn; it just
seems hard to integrate nicely.
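
To make the SVMstruct-style split concrete, here is a toy sketch (all class
and method names are illustrative, and it uses simple structured-perceptron
updates rather than SVMstruct's cutting-plane QP): the learner only ever
calls the three user-supplied pieces Andy mentions, a joint feature map
psi(x, y), a task loss, and loss-augmented inference. Multiclass
classification is used as the simplest "structured" output:

```python
import numpy as np

class ToyStructuredSVM:
    """Toy structured learner in the SVMstruct spirit.

    The user supplies psi(x, y), a task loss delta(y, y'), and
    loss-augmented inference; training only uses those callbacks.
    """

    def __init__(self, n_classes, n_features, n_iter=50, eta=0.1):
        self.n_classes = n_classes
        self.n_features = n_features + 1  # +1 for a bias feature
        self.n_iter = n_iter
        self.eta = eta
        self.w = np.zeros(n_classes * self.n_features)

    def psi(self, x, y):
        # joint feature map: one weight block per class, x lands in block y
        out = np.zeros_like(self.w)
        start = y * self.n_features
        out[start:start + self.n_features] = np.append(x, 1.0)
        return out

    def task_loss(self, y, y_hat):
        return float(y != y_hat)  # 0/1 loss for multiclass

    def loss_augmented_inference(self, x, y):
        # argmax over outputs of score + task loss (the "most violated" label)
        scores = [self.w @ self.psi(x, k) + self.task_loss(y, k)
                  for k in range(self.n_classes)]
        return int(np.argmax(scores))

    def fit(self, X, Y):
        for _ in range(self.n_iter):
            for x, y in zip(X, Y):
                y_hat = self.loss_augmented_inference(x, y)
                if y_hat != y:  # perceptron-style update toward the truth
                    self.w += self.eta * (self.psi(x, y) - self.psi(x, y_hat))
        return self

    def predict(self, X):
        return np.array([
            int(np.argmax([self.w @ self.psi(x, k)
                           for k in range(self.n_classes)]))
            for x in X])
```

On a separable three-class toy problem this recovers the labels, and the
point of the sketch is visible: swapping in a different psi, loss and
inference routine changes the structured task without touching `fit`, which
is exactly the generality that is hard to reconcile with sklearn's
arrays-plus-fit/predict convention.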
>>
>> Btw, I have some structured SVM code to play around with in Python, if you
want:
>>
http://peekaboo-vision.blogspot.co.uk/2012/06/structured-svm-and-structured.html
>>
>> Cheers,
>> Andy
>>
>>
>> ----- Original Message -----
>> From: "amir rahimi" <[email protected]>
>> To: [email protected]
>> Sent: Wednesday, 8 August 2012 12:40:52
>> Subject: [Scikit-learn-general] GradientBoostingRegression loss function
and Structured svm
>>
>>
>>
>> Hi all,
>> I have two questions/requests:
>>
>> 1) Is there any way to define an arbitrary loss function for gradient
boosting regression, e.g. the Huber penalty?
>> 2) My request is about adding structured output prediction for SVMs to the
library. Is there any plan to add that?
>>
>>
>> --
>> ----------------------------------------------------------------------
>> #include <stdio.h>
>> double d[]={9299037773.178347,2226415.983937417,307.0};
>> main(){d[2]--?d[0]*=4,d[1]*=5,main():printf((char*)d);}
>> ----------------------------------------------------------------------
>>
>>
------------------------------------------------------------------------------
>> Live Security Virtual Conference
>> Exclusive live event will cover all the ways today's security and
>> threat landscape has changed and how IT managers can respond. Discussions
>> will include endpoint security, mobile security and the latest in malware
>> threats. http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/
>> _______________________________________________
>> Scikit-learn-general mailing list
>> [email protected]
>> https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
>>
>>
>
>
>
>