Re: [Scikit-learn-general] OpenOpt and SVM

2012-10-02 Thread Dmitrey
Joseph Turian writes:
> >> Can anyone compare Theano and openopt for automatic differentiation?
> > I guess I could, but it will hardly be objective, being done by its developer.
> I don't mind a biased perspective, please offer it to give your feedback on what you emphasized.
I don'

Re: [Scikit-learn-general] OpenOpt and SVM

2012-10-01 Thread Joseph Turian
>> Can anyone compare Theano and openopt for automatic differentiation?
> I guess I could, but it will hardly be objective, being done by its developer.
I don't mind a biased perspective, please offer it to give your feedback on what you emphasized.

Re: [Scikit-learn-general] OpenOpt and SVM

2012-10-01 Thread Dmitrey
Joseph Turian writes:
> Can anyone compare Theano and openopt for automatic differentiation?
I guess I could, but it will hardly be objective, being done by its developer. Sebastian Walter had compared some Python AD tools (http://forum.openopt.org/viewtopic.php?id=316); maybe you could ask him.

Re: [Scikit-learn-general] OpenOpt and SVM

2012-10-01 Thread federico vaggi
There's also the excellent uncertainties package (http://pypi.python.org/pypi/uncertainties/) for implicit automatic differentiation.

On Mon, Oct 1, 2012 at 7:44 AM, Joseph Turian wrote:
> Can anyone compare Theano and openopt for automatic differentiation?
>
> On Fri, Sep 28, 2012 at 2:36 PM, Dm
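The AD tools being compared here (Theano, OpenOpt's automatic differentiation, the uncertainties package) all propagate derivatives through program execution. As a minimal illustration of the underlying idea only, not the API of any of these libraries, forward-mode automatic differentiation can be sketched with dual numbers in a few lines of plain Python:

```python
# Minimal forward-mode automatic differentiation via dual numbers.
# Illustrative sketch only -- not the API of Theano, OpenOpt, or the
# uncertainties package, just the mechanism such tools build on.

class Dual:
    def __init__(self, val, der=0.0):
        self.val = val  # function value
        self.der = der  # derivative w.r.t. the seeded variable

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (u v)' = u'v + u v'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

    __rmul__ = __mul__


def derivative(f, x0):
    """Exact (to float precision) derivative of f at x0."""
    return f(Dual(x0, 1.0)).der


# d/dx (3x^2 + 2x) at x = 4 is 6*4 + 2 = 26
assert derivative(lambda x: 3 * x * x + 2 * x, 4.0) == 26.0
```

Unlike finite differences, there is no truncation error: derivatives come out exact up to floating-point rounding.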

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-30 Thread Joseph Turian
Can anyone compare Theano and openopt for automatic differentiation?

On Fri, Sep 28, 2012 at 2:36 PM, Dmitrey wrote:
> Hi all,
> nice to hear about another OpenOpt application.
>
>> For small non linear problems having an exact SVM/SVR solver
>> (not approximated) is very useful IMHO.
>
> I'm

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-30 Thread Dmitrey
Paolo Losi writes:
> I'm looking forward to the new enhancement. Have you got any links about it?
>
> Paolo

He informed me about the enhancement by phone. On the internet I have currently found only one link, http://www.aticmd.md/wp-content/uploads/2012/03/A_42_Jurbenco_2_.pdf, but:
* At first you

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-30 Thread Paolo Losi
Hi Dmitrey,

On Fri, Sep 28, 2012 at 8:36 PM, Dmitrey wrote:
> >> For small non linear problems having an exact SVM/SVR solver
> >> (not approximated) is very useful IMHO.
>
> I'm not sure what this means: "For small non linear problems having
> an exact SVM/SVR solver (not approximated) is ve

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Dmitrey
Hi all,
nice to hear about another OpenOpt application.

> For small non linear problems having an exact SVM/SVR solver
> (not approximated) is very useful IMHO.

I'm not sure what this means: "For small non linear problems having an exact SVM/SVR solver (not approximated) is very useful IM

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Paolo Losi
On Fri, Sep 28, 2012 at 3:48 PM, Mathieu Blondel wrote:
> If you do subgradient descent, you can use non-smooth losses. In the
> paper I mentioned, the author is using Newton's method, which is why he's
> using differentiable losses.

Exactly. In fact ralg supports non-smooth functions [1] via

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Mathieu Blondel
On Fri, Sep 28, 2012 at 10:36 PM, Paolo Losi wrote:
> My openopt experimentation was motivated exactly by that paper.

Interesting! I hadn't read your source code, so I was assuming you were solving a QP :)

If you do subgradient descent, you can use non-smooth losses. In the paper I mention
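The distinction Mathieu draws (subgradient methods tolerate non-smooth losses, Newton's method does not) can be shown concretely. A hedged sketch of subgradient descent on the non-smooth hinge loss for a plain linear SVM, with arbitrary toy data and the standard 1/(lambda*t) step size; this is not Paolo's code, just the technique named in the thread:

```python
import numpy as np

# Subgradient descent for a linear SVM with the non-smooth hinge loss
# max(0, 1 - y * w.x). The loss is not differentiable at the hinge, but a
# subgradient always exists, which is what makes this method applicable
# where Newton's method (needing second derivatives) is not.
# Toy data and constants are illustrative assumptions.

rng = np.random.RandomState(0)
X = rng.randn(200, 2)
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)  # linearly separable labels

lam = 0.01        # L2 regularization strength
w = np.zeros(2)
for t in range(1, 501):
    margins = y * (X @ w)
    active = margins < 1.0                       # margin-violating points
    # subgradient of lam/2 ||w||^2 + mean_i hinge(y_i, w.x_i);
    # the sum over an empty 'active' set is simply the zero vector
    g = lam * w - (y[active, None] * X[active]).sum(axis=0) / len(X)
    w -= (1.0 / (lam * t)) * g                   # decaying step size

train_acc = np.mean(np.sign(X @ w) == y)
```

At the hinge point itself any value between the two one-sided slopes is a valid subgradient; the code above implicitly picks 0 there, which is the usual convention.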

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Paolo Losi
Hi Mathieu,

On Fri, Sep 28, 2012 at 3:16 PM, Mathieu Blondel wrote:
> If you can afford to store the entire kernel matrix in memory, "training
> support vector machines in the primal" [*] seems like the way to go for me.

My openopt experimentation was motivated exactly by that paper. The reaso

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Paolo Losi
On Fri, Sep 28, 2012 at 11:37 AM, federico vaggi wrote:
> I would be very interested.

Here you have the gist, federico: https://gist.github.com/3799831

Paolo

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Mathieu Blondel
If you can afford to store the entire kernel matrix in memory, "training support vector machines in the primal" [*] seems like the way to go for me. It's really easy to implement in Python + Numpy (OpenOPT cannot be added to scikit-learn). It's restricted to the squared hinge loss (what Lin et al.

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Paolo Losi
On Fri, Sep 28, 2012 at 2:54 PM, Andreas Mueller wrote:
> On 28.09.2012 14:50, Paolo Losi wrote:
>> Hi Olivier,
>> On Fri, Sep 28, 2012 at 2:28 PM, Olivier Grisel wrote:
>>> What about the memory usage? Do you need to precompute the kernel
>>> matrix in advance or do you use some LRU cache

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Paolo Losi
On Fri, Sep 28, 2012 at 2:32 PM, Andreas Mueller wrote:
> Dear All.
> Please put on sunglasses before opening the openopt webpage.

:-)

> Also: I think the way forward with SVMs is using low rank approximations of
> the kernel matrix.
> For "small" datasets, SMO or the version in LASVM seem to

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Andreas Mueller
On 28.09.2012 14:50, Paolo Losi wrote:
> Hi Olivier,
> On Fri, Sep 28, 2012 at 2:28 PM, Olivier Grisel wrote:
>> What about the memory usage? Do you need to precompute the kernel
>> matrix in advance or do you use some LRU cache for columns as in libsvm?

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Paolo Losi
Hi Olivier,

On Fri, Sep 28, 2012 at 2:28 PM, Olivier Grisel wrote:
> What about the memory usage? Do you need to precompute the kernel
> matrix in advance or do you use some LRU cache for columns as in
> libsvm?

Unlike libsvm, I definitely precompute the kernel matrix.

> Is it the same scalabili
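The memory trade-off behind Olivier's question is purely quadratic: a dense n-by-n float64 kernel matrix costs n * n * 8 bytes, which is what makes libsvm's LRU column cache attractive once n grows. A quick back-of-the-envelope helper (the function name is ours, for illustration):

```python
# Memory cost of a precomputed dense kernel matrix: n x n float64
# entries at 8 bytes each. This quadratic growth is why libsvm caches
# kernel columns on demand instead of materializing the whole matrix.

def kernel_matrix_mb(n_samples, bytes_per_entry=8):
    """Memory for a dense n x n kernel matrix, in megabytes."""
    return n_samples * n_samples * bytes_per_entry / 1e6


# 10,000 samples already need ~800 MB; 100,000 would need ~80 GB.
assert kernel_matrix_mb(10_000) == 800.0
assert kernel_matrix_mb(100_000) == 80_000.0
```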

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Andreas Mueller
Dear All.
Please put on sunglasses before opening the openopt webpage.
Also: I think the way forward with SVMs is using low rank approximations of the kernel matrix. For "small" datasets, SMO or the version in LASVM seem to work very well imho.
Cheers,
Andy

On 28.09.2012 10:53, Paol
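The low-rank route Andreas suggests is commonly realized with a Nyström approximation: reconstruct the full n-by-n kernel matrix from m << n landmark columns. A hedged NumPy sketch, with arbitrary data, kernel width, and landmark count:

```python
import numpy as np

# Nystroem low-rank approximation of an RBF kernel matrix:
# K ~= C @ pinv(W) @ C.T, where C holds m landmark columns of K and
# W is the m x m kernel matrix among the landmarks. Only n*m entries
# need to be computed/stored instead of n*n. Constants are illustrative.

rng = np.random.RandomState(0)
X = rng.randn(300, 5)


def rbf(A, B, gamma=0.1):
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)


K = rbf(X, X)                              # full kernel, for comparison only
landmarks = rng.choice(len(X), 50, replace=False)
C = rbf(X, X[landmarks])                   # n x m landmark columns
W = C[landmarks]                           # m x m landmark block
K_approx = C @ np.linalg.pinv(W) @ C.T

rel_err = np.linalg.norm(K - K_approx) / np.linalg.norm(K)
```

When the kernel matrix has fast-decaying eigenvalues, as smooth RBF kernels typically do, a modest number of landmarks already gives a small relative error, while memory drops from O(n^2) to O(n*m).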

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Olivier Grisel
What about the memory usage? Do you need to precompute the kernel matrix in advance or do you use some LRU cache for columns as in libsvm?

Is it the same scalability w.r.t. n_samples as libsvm?

-- Olivier

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread federico vaggi
I would be very interested. OpenOpt looks very good, it just has patchy documentation, so some well commented examples would be welcome. Perhaps offer to share the documentation with OpenOpt's developer?

Federico

On Fri, Sep 28, 2012 at 10:53 AM, Paolo Losi wrote:
> Hi all,
>
> I'm following

[Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Paolo Losi
Hi all,

I'm following the thread about libsvm... I just wanted to share some impressive results I got by solving SVM with OpenOpt [1]. My main use case was to try different loss functions for regression (libsvm only provides epsilon insensitive). In a couple of hours I succeeded in implementing
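Paolo's use case, swapping in regression losses beyond the epsilon-insensitive one that libsvm hard-codes, is easy to illustrate as plain functions of the residual. The loss choices and constants below are illustrative, not Paolo's actual experiment:

```python
import numpy as np

# Why solving SVR with a general-purpose optimizer is attractive: the
# loss on the residual r = y_true - y_pred becomes a pluggable choice.
# Epsilon-insensitive is what libsvm's SVR uses; squared and Huber are
# two common alternatives one might try instead.

def eps_insensitive(r, eps=0.1):
    # zero inside the epsilon tube, linear outside
    return np.maximum(0.0, np.abs(r) - eps)


def squared(r):
    return r ** 2


def huber(r, delta=1.0):
    # quadratic near zero, linear for large residuals (outlier-robust)
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r ** 2, delta * (a - 0.5 * delta))


r = np.array([0.05, 0.5, 2.0])
assert eps_insensitive(r)[0] == 0.0        # inside the tube: exactly zero
assert huber(r)[2] == 1.5                  # linear, not quadratic, growth
```

With a solver that only needs the loss (or its subgradient), trying a new loss is a one-function change instead of a new dual derivation.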