On Tue, Feb 14, 2012 at 06:03:44PM -0500, Ian Goodfellow wrote:
> I've observed that SVMs fit with sklearn consistently get around 5
> percentage points lower accuracy than equivalent SVMs fit with Adam
> Coates' SVM implementation based on minFunc. Am I overlooking some
> basic usage issue (eg too loose of a default convergence criterion),
> or is this likely to be a defect in the underlying libsvm
> implementation?
> 
> To demonstrate, run svm_comparison.m in matlab then svm_comparison.py in 
> python.
> You'll need Adam Coates' code from
> http://www.stanford.edu/~acoates/papers/sc_vq_demo.tgz  for train_svm
> to work.

There's a bug in svm_train.py: you don't squeeze out the extra dimension of
new_y, so the == broadcasts the two vectors against each other into a matrix.
With that fix, it's up to 0.68, 0.68 with the github "0.10.X" branch.
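
For anyone else hitting this, here's a minimal sketch of the broadcasting
pitfall (array names here are made up for illustration, not taken from the
script):

    import numpy as np

    y_true = np.array([0, 1, 1, 0, 1])    # shape (5,)
    new_y = y_true.reshape(-1, 1)         # shape (5, 1), extra dimension

    # (5, 1) == (5,) broadcasts to a (5, 5) matrix, so the mean is not
    # an accuracy at all:
    wrong = (new_y == y_true).mean()

    # Squeezing out the extra dimension restores an elementwise comparison:
    right = (np.squeeze(new_y) == y_true).mean()

    print(wrong, right)   # 0.52 vs. 1.0 for this toy example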

By the way, something is bonkers with the current master. Andreas suggested
to me that the default behaviour of scale_C might have changed, but even with
scale_C=False I am getting 0.0 train accuracy and 0.0 test accuracy with the
same code.

David
