Re: [Scikit-learn-general] Random features for approximate kernel maps

2011-11-20 Thread Andreas Mueller
Btw, for people interested in the subject, I just found this paper: http://arxiv.org/pdf/1109.4603 It approximates the RBF kernel using Taylor expansions and seems to work better than the Rahimi and Recht method on sparse datasets. My datasets are not sparse, so I didn't really look
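The Taylor-expansion trick mentioned there can be sketched as follows. This is a toy construction built only on the identity exp(-g*||x-y||^2) = e^(-g*||x||^2) * e^(-g*||y||^2) * exp(2g*<x,y>), not necessarily the paper's actual method: truncating the series exp(2g*<x,y>) = sum_r (2g)^r * <x,y>^r / r! yields an explicit, if high-dimensional, polynomial feature map.

```python
import math
import numpy as np

def taylor_rbf_features(X, gamma, order=3):
    """Explicit features whose inner products truncate the Taylor series
    of the RBF kernel:
    exp(-g||x-y||^2) = e^{-g||x||^2} e^{-g||y||^2} sum_r (2g)^r <x,y>^r / r!
    """
    n, d = X.shape
    scale = np.exp(-gamma * (X ** 2).sum(axis=1))  # e^{-g||x||^2} prefactor
    blocks = []
    block = np.ones((n, 1))  # order-0 monomials
    for r in range(order + 1):
        coef = math.sqrt((2.0 * gamma) ** r / math.factorial(r))
        blocks.append(coef * block)
        # raise the monomial order by one: all products x_{i1} * ... * x_{i(r+1)}
        block = np.einsum("ni,nj->nij", block, X).reshape(n, -1)
    return scale[:, None] * np.hstack(blocks)
```

The dimensionality grows like d^order, so this naive version is only practical for small input dimension or low truncation order; doing better than that is presumably the point of the paper.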

Re: [Scikit-learn-general] Random features for approximate kernel maps

2011-11-14 Thread Olivier Grisel
2011/11/14 Andreas Müller:
>
> If you're still interested in MNIST results:
> using gamma=0.03 and C=1 I get 0.9845 with SVC in 12 minutes
> (20GB kernel cache, don't know how much was used),
> with the same parameters on LinearSVC and 5000 sampled features
> I get 0.9783 in ~3 minutes.
> Going up
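Scikit-learn now ships this kind of sampled feature map as RBFSampler in sklearn.kernel_approximation, so the comparison above can be reproduced at small scale. A stand-in sketch using the bundled digits dataset rather than MNIST, with gamma=0.03 and C=1 taken from the thread but n_components=1000 chosen arbitrarily for illustration:

```python
# Small-scale stand-in for the MNIST comparison: exact RBF-kernel SVC
# versus a linear SVM trained on sampled random Fourier features.
from sklearn.datasets import load_digits
from sklearn.kernel_approximation import RBFSampler
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC, LinearSVC

X, y = load_digits(return_X_y=True)
X = X / 16.0  # scale pixel values to [0, 1]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

exact = SVC(kernel="rbf", gamma=0.03, C=1).fit(X_train, y_train)
approx = make_pipeline(
    RBFSampler(gamma=0.03, n_components=1000, random_state=0),
    LinearSVC(C=1),
).fit(X_train, y_train)

print(exact.score(X_test, y_test), approx.score(X_test, y_test))
```

As in the thread's MNIST numbers, the approximate pipeline trades a little accuracy for a model that trains and predicts in linear time in the number of samples.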

Re: [Scikit-learn-general] Random features for approximate kernel maps

2011-11-14 Thread Andreas Müller
On 11/11/2011 04:45 PM, Andreas Müller wrote:
> On 11/11/2011 04:38 PM, Olivier Grisel wrote:
>> 2011/11/11 Gael Varoquaux:
>>> On Fri, Nov 11, 2011 at 04:11:46PM +0100, Andreas Müller wrote:
>>>>> If you find that it does work/is useful on real problem, yes!
>>>> I just started working on it. Atm

Re: [Scikit-learn-general] Random features for approximate kernel maps

2011-11-11 Thread Andreas Müller
On 11/11/2011 04:38 PM, Olivier Grisel wrote:
> 2011/11/11 Gael Varoquaux:
>> On Fri, Nov 11, 2011 at 04:11:46PM +0100, Andreas Müller wrote:
>>>> If you find that it does work/is useful on real problem, yes!
>>> I just started working on it. Atm I can get 3% error on MNIST using
>>> sklearn's SGD.

Re: [Scikit-learn-general] Random features for approximate kernel maps

2011-11-11 Thread Olivier Grisel
2011/11/11 Gael Varoquaux:
> On Fri, Nov 11, 2011 at 04:11:46PM +0100, Andreas Müller wrote:
>> > If you find that it does work/is useful on real problem, yes!
>> I just started working on it. Atm I can get 3% error on MNIST using
>> sklearn's SGD.
>
> Does sound good.

Doesn't a grid-searched gau

Re: [Scikit-learn-general] Random features for approximate kernel maps

2011-11-11 Thread Gael Varoquaux
On Fri, Nov 11, 2011 at 04:11:46PM +0100, Andreas Müller wrote:
> > If you find that it does work/is useful on real problem, yes!
> I just started working on it. Atm I can get 3% error on MNIST using
> sklearn's SGD.

Does sound good. I find that one of the values of the scikit, and in particular

Re: [Scikit-learn-general] Random features for approximate kernel maps

2011-11-11 Thread Andreas Müller
On 11/11/2011 04:04 PM, Gael Varoquaux wrote:
> On Fri, Nov 11, 2011 at 04:00:54PM +0100, Andreas Müller wrote:
>> I just implemented the paper "Random Features for Large-Scale Kernel
>> Machines".
> The Rahimi and Recht one :). It's been on my desktop, waiting for
> implementation for something like a year.

Re: [Scikit-learn-general] Random features for approximate kernel maps

2011-11-11 Thread Gael Varoquaux
On Fri, Nov 11, 2011 at 04:00:54PM +0100, Andreas Müller wrote:
> I just implemented the paper "Random Features for Large-Scale Kernel
> Machines".

The Rahimi and Recht one :). It's been on my desktop, waiting for
implementation for something like a year.

> I was wondering whether this would be i

[Scikit-learn-general] Random features for approximate kernel maps

2011-11-11 Thread Andreas Müller
Hey everybody. I just implemented the paper "Random Features for Large-Scale Kernel Machines". It proposes Monte Carlo approximations of the kernel mapping, which make explicit kernel maps feasible. This is awesome since explicit kernel maps make it possible to use SGD classifiers easily. I was
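The Rahimi and Recht construction for the RBF kernel fits in a few lines of NumPy. This is a toy illustration with my own naming, not the implementation under discussion: frequencies are drawn from the Fourier transform of the kernel, w ~ N(0, 2*gamma*I), with offsets b ~ Uniform[0, 2*pi].

```python
import numpy as np

def rbf_random_features(X, gamma=1.0, n_components=500, seed=0):
    """Monte Carlo feature map z with z(x).z(y) ~= exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Frequencies from the Fourier transform of the RBF kernel, plus
    # uniform phase offsets.
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(n_features, n_components))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_components)
    # sqrt(2/D) * cos(x.W + b): in expectation z(x).z(y) equals the kernel.
    return np.sqrt(2.0 / n_components) * np.cos(X @ W + b)
```

In a real estimator W and b would be sampled once at fit time and reused to transform both training and test data; the Monte Carlo error of each kernel entry decays like 1/sqrt(n_components).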