Hello
Le 20/02/2012 15:41, Andreas a écrit :
Hi Nick.
Afaik no one plans to work on that at the moment.
I was thinking about it some while ago, but then decided not to do it.
By MKL you mean learning the weights of different kernels, right?
Or do you mean inducing some group sparsity on the different kernels?
From what I heard recently, learning the weights did not prove very
useful in practice (on its own). Do you have an example where learning
the weights jointly with the classifier helped, compared to just adding
the kernels (= concatenating features)? I would be very interested :)
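To make the "adding kernels = concatenating features" equivalence concrete, here is a small numpy sketch (illustrative only, names are mine): for linear kernels on two feature blocks, the sum of the Gram matrices equals the Gram matrix of the concatenated features.

```python
import numpy as np

rng = np.random.RandomState(0)
X1 = rng.randn(5, 3)   # feature block for "kernel 1"
X2 = rng.randn(5, 4)   # feature block for "kernel 2"

# Linear kernel on each feature block.
K1 = X1 @ X1.T
K2 = X2 @ X2.T

# Linear kernel on the concatenated features (Cartesian product of
# the feature spaces).
X = np.hstack([X1, X2])
K_concat = X @ X.T

# Summing the kernels is the same as concatenating the features.
print(np.allclose(K1 + K2, K_concat))  # True
```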
In my experience, MKL (with L1-regularization) helps in two situations:
(i) when you have both good and bad kernels in the mix,
(ii) when you want to select a kernel parameter by summing the same
kernel over different parameter settings, knowing that performance
peaks sharply (a Dirac) at one specific parameter value.
The common underlying assumption is that, amongst the kernels you sum,
only a few are useful, whereas the others actually *hurt* the performance.
In general, you don't shoot yourself in the foot by voluntarily using
bad kernels, and using MKL with a few "ok" kernels can be slightly worse
than just averaging them, while being more computationally demanding.
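The "just average them" baseline is easy to try in scikit-learn via a precomputed Gram matrix; a minimal sketch (the data and the uniform 0.5/0.5 weights are my own illustrative choices, not from the thread):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel

rng = np.random.RandomState(0)
X = rng.randn(40, 5)
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # toy labels

# Uniformly average two kernels instead of learning the weights.
K = 0.5 * rbf_kernel(X, gamma=0.1) + 0.5 * polynomial_kernel(X, degree=2)

# SVC accepts the combined Gram matrix directly.
clf = SVC(kernel="precomputed").fit(K, y)
print(clf.score(K, y))
```

At predict time you pass the kernel between the test and training points, combined with the same fixed weights.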
If you can (e.g. with Gaussian RBF kernels), multiplying kernels makes a
lot of sense (and performs nicely) as you are taking the tensor
product of the feature spaces (RKHSs), which yields an "and" effect:
points close wrt. the product of kernels are close wrt. all the
kernels. This makes the opposite assumption of L1-MKL (all kernels are
useful). In my limited experience, this scenario is the more common one.
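For Gaussian RBF kernels the product has a closed form that shows the "and" effect: the elementwise product of per-block RBF kernels is itself a Gaussian kernel on the concatenated features, with one bandwidth per block, so the similarity is high only if *every* block distance is small. A small numpy check (bandwidths 0.5 and 2.0 are arbitrary illustrative values):

```python
import numpy as np

def rbf(X, gamma):
    # Pairwise squared distances, then the Gaussian kernel.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

rng = np.random.RandomState(0)
X1 = rng.randn(6, 2)   # first feature block
X2 = rng.randn(6, 3)   # second feature block

# Elementwise product of the two RBF kernels.
K = rbf(X1, 0.5) * rbf(X2, 2.0)

# Same thing as a single Gaussian on the concatenation, with a
# per-block bandwidth: exp(-(g1*d1^2 + g2*d2^2)).
sq1 = ((X1[:, None] - X1[None]) ** 2).sum(-1)
sq2 = ((X2[:, None] - X2[None]) ** 2).sum(-1)
K_prod = np.exp(-(0.5 * sq1 + 2.0 * sq2))

print(np.allclose(K, K_prod))  # True
```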
Note that, on the other hand, summing the kernels is equivalent to
concatenating the feature spaces (Cartesian product), whose meaning (if
it even has one) in general RKHS theory I cannot quite figure out...
About implementing them: I'm not sure that would be an easy thing to do.
I think you would have to dig quite deeply into LibSVM to do this
efficiently.
I agree with Andy on this. Some years ago (at the beginning of the MKL
"trend"), I had to implement the SimpleMKL algorithm. It was quite a bit
of work, and I indeed needed to severely hack the internals of LibSVM
(you need the gradient of the cost function). There was also some
voodoo engineering involved (line search...). It's doable, but it takes
some effort (for a reward I didn't get...).
My 2 cents,
Adrien
On 02/20/2012 03:36 PM, Nicholas Pilkington wrote:
I was wondering if there were any immediate plans to implement
Multiple Kernel SVMs in sklearn? There is a lot of SVM functionality
already, and it would be nice to extend it to MKL.
If not, what would be the best way to get involved in implementing them?
Nick
--
Nicholas C.V. Pilkington
University of Cambridge
------------------------------------------------------------------------------
Try before you buy = See our experts in action!
The most comprehensive online learning library for Microsoft developers
is just $99.99! Visual Studio, SharePoint, SQL - plus HTML5, CSS3, MVC3,
Metro Style Apps, more. Free future releases when you subscribe now!
http://p.sf.net/sfu/learndevnow-dev2
_______________________________________________
Scikit-learn-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general