Re: [scikit-learn] Why is subset invariance necessary for transform()?

2020-01-20 Thread Joel Nothman
I think allowing subset invariance to not hold is making stronger assumptions than we usually do about what it means to have a "test set". Having a transformation like this that relies on test set statistics implies that the test set is more than just selected samples, but rather that a large colle…

[scikit-learn] Why is subset invariance necessary for transform()?

2020-01-20 Thread Charles Pehlivanian
Not all data transformers have a transform method. For those that do, subset invariance is assumed, as expressed in check_methods_subset_invariance(). It must hold, for example, that T.transform(X)[i] == T.transform(X[i:i+1]). This is true for classic projections - PCA, kernel PCA, etc. - but not for s…
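The property described above can be illustrated with a small sketch (a hypothetical check, not the actual estimator-check code): for a subset-invariant transformer such as PCA, transforming the full array and then slicing must give the same result as transforming the slice alone.

```python
# Sketch of the subset-invariance property: each row's output depends
# only on that row, so transform-then-slice equals slice-then-transform.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.RandomState(0)
X = rng.rand(20, 5)

pca = PCA(n_components=2).fit(X)

full = pca.transform(X)          # transform everything, then pick row 3
single = pca.transform(X[3:4])   # transform only row 3

# For PCA the projection of a row uses only fitted components, so these agree.
assert np.allclose(full[3:4], single)
```

A transformer that normalizes using statistics of the batch passed to transform() would fail this check, since row 3's output would then depend on which other rows were present.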

Re: [scikit-learn] ask a question about weights for features in svc with rbf kernel

2020-01-20 Thread Rujing Zha
Hi Guillaume, Is it OK for the rbf kernel? As the documentation says: "Weights assigned to the features (coefficients in the primal problem). This is only available in the case of a linear kernel." At 2020-01-20 20:30:53, "Guillaume Lemaître" wrote: You can look at the attribute coef_ once yo…
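The documentation quoted above can be confirmed directly: coef_ is defined only for a linear kernel, where the decision function is an explicit weighted sum of the features; with kernel="rbf" the weights live implicitly in kernel space and accessing coef_ raises an error. A minimal sketch (synthetic data, not from the thread):

```python
# coef_ exists for linear SVC but not for rbf, per the SVC documentation.
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=100, n_features=4, random_state=0)

linear = SVC(kernel="linear").fit(X, y)
print(linear.coef_.shape)  # one weight per feature for binary case: (1, 4)

rbf = SVC(kernel="rbf").fit(X, y)
try:
    rbf.coef_
except AttributeError as e:
    print("rbf kernel:", e)  # coef_ is not available for non-linear kernels
```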

Re: [scikit-learn] ask a question about weights for features in svc with rbf kernel

2020-01-20 Thread Guillaume Lemaître
You can look at the attribute coef_ once your model is fitted. Sent from my phone - sorry to be brief and for potential misspellings.

[scikit-learn] ask a question about weights for features in svc with rbf kernel

2020-01-20 Thread Rujing Zha
Hi experts and users, I am going to extract the pattern of an SVC, but I do not know how to extract the weights for each feature using an SVC classifier with an rbf kernel. Thank you. Rujing
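Since coef_ is unavailable for an rbf kernel, one model-agnostic option (not mentioned in the thread, offered here as a suggestion) is scikit-learn's permutation_importance, which scores each feature by how much shuffling it degrades the model:

```python
# Model-agnostic feature scoring that works with any kernel,
# via sklearn.inspection.permutation_importance (synthetic data).
from sklearn.datasets import make_classification
from sklearn.inspection import permutation_importance
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
clf = SVC(kernel="rbf").fit(X, y)

result = permutation_importance(clf, X, y, n_repeats=10, random_state=0)
print(result.importances_mean)  # one importance score per feature
```

In practice the scoring should be done on a held-out set rather than the training data to avoid optimistic importances.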