Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Dmitrey
Hi all, nice to hear about another OpenOpt application. > For small non linear problems having an exact SVM/SVR solver > (not approximated) is very useful IMHO. I'm not sure what this means: "For small non linear problems having an exact SVM/SVR solver (not approximated) is very useful IM

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Paolo Losi
On Fri, Sep 28, 2012 at 3:48 PM, Mathieu Blondel wrote: > # If you do subgradient descent, you can use non-smooth losses. In the > paper I mentioned, the author is using Newton's method, which is why he's > using differentiable losses. > Exactly. In fact, ralg supports non-smooth functions [1] via

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Mathieu Blondel
On Fri, Sep 28, 2012 at 10:36 PM, Paolo Losi wrote: > > My openopt experimentation was motivated exactly by that paper. > Interesting! I hadn't read your source code so I was assuming you were solving a QP :) # If you do subgradient descent, you can use non-smooth losses. In the paper I mention
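Mathieu's distinction above (subgradient descent handles non-smooth losses, whereas Newton's method needs differentiable ones) can be illustrated with a minimal subgradient-descent sketch for the non-smooth hinge loss. Everything below is illustrative, not code from the thread:

```python
import numpy as np

def hinge_subgradient_svm(X, y, lam=0.01, n_iter=200, lr=0.1):
    """Minimize lam/2 * ||w||^2 + mean(max(0, 1 - y * (X @ w)))
    by subgradient descent with a decaying step size."""
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, n_iter + 1):
        margins = y * (X @ w)
        active = margins < 1  # points violating the margin (non-smooth part)
        # a valid subgradient: lam*w minus the mean of y_i * x_i over violators
        g = lam * w - (y[active, None] * X[active]).sum(axis=0) / n
        w -= (lr / t) * g
    return w

# toy linearly separable problem
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(20, 2) + 2, rng.randn(20, 2) - 2])
y = np.array([1.0] * 20 + [-1.0] * 20)
w = hinge_subgradient_svm(X, y)
acc = np.mean(np.sign(X @ w) == y)
```

The hinge loss is not differentiable at margin 1, but any subgradient works there, which is exactly why subgradient methods (unlike plain Newton) tolerate it.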

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Paolo Losi
Hi Mathieu, On Fri, Sep 28, 2012 at 3:16 PM, Mathieu Blondel wrote: > If you can afford to store the entire kernel matrix in memory, "training > support vector machines in the primal" [*] seems like the way to go for me. My openopt experimentation was motivated exactly by that paper. The reaso

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Paolo Losi
On Fri, Sep 28, 2012 at 11:37 AM, federico vaggi wrote: > I would be very interested. Here is the gist, federico: https://gist.github.com/3799831 Paolo

Re: [Scikit-learn-general] libsvm PR

2012-09-28 Thread Olivier Grisel
2012/9/26 Andreas Mueller : > > Can you give some insights into why this check is necessary and in > what kind of situations LibSVM fails to converge? I guess it uses > the duality gap for convergence. Is it the case that this is not > a good measure sometimes? I guess this user on stackoverflow w

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Mathieu Blondel
If you can afford to store the entire kernel matrix in memory, "training support vector machines in the primal" [*] seems like the way to go for me. It's really easy to implement in Python + Numpy (OpenOPT cannot be added to scikit-learn). It's restricted to the squared hinge loss (what Lin et al.
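The primal approach Mathieu describes (squared hinge loss on the kernel expansion, trained with Newton-style updates as in Chapelle's "Training a support vector machine in the primal") can indeed be sketched in a few lines of NumPy. In that scheme, each iteration zeroes the coefficients outside the current support set and solves a regularized linear system on it. This is an illustrative sketch under those assumptions, not code from the thread:

```python
import numpy as np

def primal_svm_squared_hinge(K, y, lam=1.0, max_iter=50):
    """Kernel SVM in the primal with the squared hinge loss.
    Iterate: beta is zero outside the current support set sv, and on sv
    solves (K_ss + lam*I) beta_s = y_s; stop when sv stabilizes."""
    n = K.shape[0]
    sv = np.ones(n, dtype=bool)  # start with every point as a support vector
    beta = np.zeros(n)
    for _ in range(max_iter):
        beta = np.zeros(n)
        beta[sv] = np.linalg.solve(
            K[np.ix_(sv, sv)] + lam * np.eye(sv.sum()), y[sv])
        new_sv = y * (K @ beta) < 1  # current margin violators
        if np.array_equal(new_sv, sv):
            break
        sv = new_sv
    return beta

# toy problem with a precomputed RBF kernel
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(15, 2) + 2, rng.randn(15, 2) - 2])
y = np.array([1.0] * 15 + [-1.0] * 15)
sq = ((X[:, None] - X[None, :]) ** 2).sum(-1)
K = np.exp(-sq / 2.0)
beta = primal_svm_squared_hinge(K, y, lam=0.1)
train_acc = np.mean(np.sign(K @ beta) == y)
```

The squared hinge is what makes this convenient: the loss is differentiable, so each step reduces to a dense linear solve over the active set, which is cheap once the kernel matrix fits in memory.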

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Paolo Losi
On Fri, Sep 28, 2012 at 2:54 PM, Andreas Mueller wrote: > Am 28.09.2012 14:50, schrieb Paolo Losi: > > Hi Olivier, > > On Fri, Sep 28, 2012 at 2:28 PM, Olivier Grisel > wrote: > >> What about the memory usage? Do you need to precompute the kernel >> matrix in advance or do you use some LRU cache

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Paolo Losi
On Fri, Sep 28, 2012 at 2:32 PM, Andreas Mueller wrote: > Dear All. > Please put on sunglasses before opening the openopt webpage. > :-) Also: I think the way forward with SVMs is using low rank approximations of > the kernel matrix. > For "small" datasets, SMO or the version in LASVM seem to

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Andreas Mueller
Am 28.09.2012 14:50, schrieb Paolo Losi: Hi Olivier, On Fri, Sep 28, 2012 at 2:28 PM, Olivier Grisel wrote: What about the memory usage? Do you need to precompute the kernel matrix in advance or do you use some LRU cache for columns as in libsvm?

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Paolo Losi
Hi Olivier, On Fri, Sep 28, 2012 at 2:28 PM, Olivier Grisel wrote: > What about the memory usage? Do you need to precompute the kernel > matrix in advance or do you use some LRU cache for columns as in > libsvm? > unlike libsvm I definitely precompute the kernel matrix. Is it the same scalabili
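For context on the memory question: a precomputed float64 kernel matrix costs n² × 8 bytes (about 800 MB at n = 10,000) no matter how it is built, which is the scalability limit being discussed. What can be avoided is materializing the intermediate pairwise-distance tensor at full size by filling the matrix in row blocks. A hedged sketch, not from Paolo's gist:

```python
import numpy as np

def rbf_kernel_chunked(X, gamma=0.5, chunk=1000):
    """Fill the full n x n RBF kernel matrix block of rows at a time.
    The result is still O(n^2) memory, but temporaries stay chunk x n."""
    n = X.shape[0]
    K = np.empty((n, n))
    sq_norms = (X ** 2).sum(axis=1)
    for start in range(0, n, chunk):
        stop = min(start + chunk, n)
        # ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x.y, clipped at 0 for safety
        d2 = (sq_norms[start:stop, None] + sq_norms[None, :]
              - 2.0 * X[start:stop] @ X.T)
        K[start:stop] = np.exp(-gamma * np.maximum(d2, 0.0))
    return K

# check the block computation against the direct formula on a small sample
X = np.random.RandomState(0).randn(57, 3)
K = rbf_kernel_chunked(X, gamma=0.5, chunk=10)
d2 = ((X[:, None] - X[None, :]) ** 2).sum(-1)
err = np.abs(K - np.exp(-0.5 * d2)).max()
```

libsvm's LRU column cache sidesteps the O(n²) storage entirely at the price of recomputing evicted columns, which is the trade-off Olivier is asking about.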

Re: [Scikit-learn-general] Convolutional / shift invariant dictionary learning on question on Stack Overflow

2012-09-28 Thread Andreas Mueller
Am 28.09.2012 11:50, schrieb Olivier Grisel: > I have no good answer what to reply to: > > http://stackoverflow.com/questions/12636842/shift-invariant-sparse-coding-in-scikit-learn > > Anybody knows whether it would be complicated to implement this? Maybe > by deriving existing classes of sklearn?

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Andreas Mueller
Dear All. Please put on sunglasses before opening the openopt webpage. Also: I think the way forward with SVMs is using low rank approximations of the kernel matrix. For "small" datasets, SMO or the version in LASVM seem to work very well imho. Cheers, Andy Am 28.09.2012 10:53, schrieb Paol
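A common low-rank approach of the kind Andreas mentions is the Nystroem approximation: pick m landmark points and build feature vectors Z such that Z Zᵀ ≈ K, so linear methods on Z mimic the kernel method. A minimal sketch with illustrative parameters (not from the thread); with m = n it reproduces the kernel exactly:

```python
import numpy as np

def nystroem_features(X, n_components=50, gamma=0.5, seed=0):
    """Nystroem sketch: return Z with Z @ Z.T approximating the RBF kernel."""
    rng = np.random.RandomState(seed)
    idx = rng.choice(len(X), n_components, replace=False)
    basis = X[idx]

    def rbf(A, B):
        sq = ((A[:, None] - B[None, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)

    W = rbf(basis, basis)  # kernel among landmarks (m x m)
    C = rbf(X, basis)      # kernel between all points and landmarks (n x m)
    # W^{-1/2} via eigendecomposition; clip tiny eigenvalues for stability
    vals, vecs = np.linalg.eigh(W)
    vals = np.clip(vals, 1e-12, None)
    W_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return C @ W_inv_sqrt

rng = np.random.RandomState(1)
X = rng.randn(40, 3)
Z = nystroem_features(X, n_components=40)  # m = n: exact reconstruction
sq = ((X[:, None] - X[None, :]) ** 2).sum(-1)
K = np.exp(-0.5 * sq)
err = np.abs(Z @ Z.T - K).max()
```

With m ≪ n the kernel matrix never needs to be stored in full: training works on the n × m matrix Z instead, which is the memory win low-rank methods buy.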

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Olivier Grisel
What about the memory usage? Do you need to precompute the kernel matrix in advance or do you use some LRU cache for columns as in libsvm? Is it the same scalability w.r.t. n_samples as libsvm? -- Olivier

Re: [Scikit-learn-general] Shift-invariant Sparse Coding in scikit-learn?

2012-09-28 Thread Andreas Mueller
Hi Christian. Are you thinking about 1d or 2d convolutions? I am not so familiar with 1d signal processing but there has been some work on convolutional sparse coding for image patches. This is not really planned for sklearn, afaik, though. In computer vision, I think there was no big difference in

[Scikit-learn-general] Convolutional / shift invariant dictionary learning on question on Stack Overflow

2012-09-28 Thread Olivier Grisel
I have no good answer what to reply to: http://stackoverflow.com/questions/12636842/shift-invariant-sparse-coding-in-scikit-learn Anybody knows whether it would be complicated to implement this? Maybe by deriving existing classes of sklearn? Any pointers in the literature on good implementation

Re: [Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread federico vaggi
I would be very interested. OpenOpt looks very good, it just has patchy documentation, so some well commented examples would be welcome. Perhaps offer to share the documentation with OpenOpt's developer? Federico On Fri, Sep 28, 2012 at 10:53 AM, Paolo Losi wrote: > Hi all, > > I'm following

[Scikit-learn-general] OpenOpt and SVM

2012-09-28 Thread Paolo Losi
Hi all, I'm following the thread about libsvm... I just wanted to share some impressive results I got by solving SVM with OpenOpt [1]. My main use case was to try different loss functions for regression (libsvm only provides epsilon-insensitive). In a couple of hours I succeeded in implementing
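Paolo's use case (trying regression losses other than libsvm's epsilon-insensitive one) can also be sketched without OpenOpt, e.g. with `scipy.optimize.minimize` on a kernel expansion. Here a Huber loss is used purely as an illustration; none of these names or parameters come from Paolo's gist:

```python
import numpy as np
from scipy.optimize import minimize

def kernel_regression_custom_loss(K, y, lam=0.1, delta=1.0):
    """Fit f = K @ beta by minimizing
    lam/2 * beta' K beta + sum(huber(y - K @ beta))."""
    def huber(r):
        a = np.abs(r)
        # quadratic near zero, linear in the tails (robust to outliers)
        return np.where(a <= delta, 0.5 * r ** 2, delta * (a - 0.5 * delta))

    def obj(beta):
        r = y - K @ beta
        return lam / 2 * beta @ K @ beta + huber(r).sum()

    res = minimize(obj, np.zeros(len(y)), method="L-BFGS-B")
    return res.x

# toy 1-d regression with a precomputed RBF kernel
X = np.linspace(0, 3, 30)[:, None]
y = np.sin(2 * X[:, 0])
K = np.exp(-(X - X.T) ** 2)
beta = kernel_regression_custom_loss(K, y, lam=0.01)
rmse = np.sqrt(np.mean((K @ beta - y) ** 2))
```

Swapping in a different loss only means changing `huber`, which is the flexibility Paolo gets from a general NLP solver like OpenOpt's ralg.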

[Scikit-learn-general] Shift-invariant Sparse Coding in scikit-learn?

2012-09-28 Thread Christian Vollmer
Hello, there is a nice collection of sparse coding and dictionary algorithms implemented in scikit-learn. However, it seems there are no shift-invariant implementations. Are there plans to include any shift-invariant implementations or is there a way to apply the implemented algorithms in a sh