Thanks a lot for the explanation.
So do I see this right: I would need to calculate the KL divergence for
each pair of feature vectors?
I have already tried a pipeline that applies the additive chi-squared
kernel approximation followed by a linear SVC. This boosts my results a
bit, but I am still staying below the accuracy I would like to reach.
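For reference, the pipeline I mean looks roughly like this (a minimal
sketch using sklearn's AdditiveChi2Sampler and LinearSVC; the X and y
below are hypothetical placeholder data, and the sampler needs
non-negative features):

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.kernel_approximation import AdditiveChi2Sampler
from sklearn.svm import LinearSVC

X = np.random.rand(100, 20)       # placeholder: non-negative feature vectors
y = np.random.randint(0, 2, 100)  # placeholder: binary labels

# Approximate the additive chi-squared kernel map, then fit a linear SVM.
clf = make_pipeline(AdditiveChi2Sampler(sample_steps=2), LinearSVC())
clf.fit(X, y)
print(clf.score(X, y))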
On Mon, May 14, 2012 at 05:00:54PM +0200, Philipp Singer wrote:
> Thanks, that sounds really promising.
>
> Is there an implementation of KL divergence in scikit-learn? If so, how can I
> directly use that?
I don't believe there is, but it's quite simple to do yourself. Many
algorithms in scikit-learn accept a precomputed distance matrix or a
custom metric callable, so you can plug in your own implementation.
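For example, a small NumPy sketch (assuming p and q are non-negative
vectors that get normalised to sum to 1; scipy.stats.entropy(p, q)
computes the same quantity):

import numpy as np

def kl_divergence(p, q, eps=1e-10):
    # Add eps to avoid log(0) and division by zero, then renormalise.
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return np.sum(p * np.log(p / q))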
Thanks, that sounds really promising.
Is there an implementation of KL divergence in scikit-learn? If so, how can I
directly use that?
Regards,
Philipp
> Hi Philipp,
>
> you could try a nearest neighbors approach and use KL-divergence as
> your "distance metric"**
>
> best,
> Peter
>
> ** KL-divergence is not a proper metric but it might work
Hi Philipp,
you could try a nearest neighbors approach and use KL-divergence as
your "distance metric"**
best,
Peter
** KL-divergence is not a proper metric but it might work
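Something along these lines might work (a sketch only: sym_kl here
symmetrises the divergence since plain KL is asymmetric, the Dirichlet
draws are hypothetical stand-ins for rows that are distributions, and
KNeighborsClassifier only takes a callable metric with algorithm='brute'):

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def sym_kl(p, q, eps=1e-10):
    # Symmetrised KL-divergence; eps guards against log(0).
    p, q = p + eps, q + eps
    return np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p))

X = np.random.dirichlet(np.ones(10), size=100)  # placeholder: rows sum to 1
y = np.random.randint(0, 2, 100)                # placeholder labels

knn = KNeighborsClassifier(n_neighbors=5, metric=sym_kl, algorithm='brute')
knn.fit(X, y)
print(knn.score(X, y))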
2012/5/14 :
> I would try using a chi-squared kernel. You can start by using the
> approximation provided in sklearn.
I would try using a chi-squared kernel. You can start by using the
approximation provided in sklearn.
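If the approximation turns out to be too coarse, a sketch of the exact
route (chi2_kernel and SVC with a precomputed kernel are both in sklearn;
X and y are placeholders, and the features must be non-negative):

import numpy as np
from sklearn.metrics.pairwise import chi2_kernel
from sklearn.svm import SVC

X = np.random.rand(100, 20)       # placeholder: non-negative features
y = np.random.randint(0, 2, 100)  # placeholder labels

K = chi2_kernel(X)                # exact kernel matrix on the training data
svm = SVC(kernel='precomputed').fit(K, y)
print(svm.score(K, y))

Note that for test data you would pass chi2_kernel(X_test, X_train) to
predict, since the precomputed kernel must be computed against the
training samples.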
Cheers, andy
--
This message was sent from my Android phone with K-9 Mail.
Philipp Singer wrote:
Hey there!
I am currently trying to classify a dataset which has the following