In case it may help any of you, I simply ended up using LinearSVC.
Its .decision_function() gives one value for each class and is compatible
with the .predict() function!
Plus, it turned out to be faster than svm.SVC (as always) and more accurate
for my dataset.
Thanks again for your help
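A minimal sketch of the behaviour described above, using iris as a stand-in for the poster's dataset (the data and `dual=False` setting are illustrative, not from the thread):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.svm import LinearSVC

# LinearSVC's decision_function returns one score per class, and
# predict() agrees with the argmax of those scores.
X, y = load_iris(return_X_y=True)
clf = LinearSVC(dual=False).fit(X, y)
scores = clf.decision_function(X)

assert scores.shape == (X.shape[0], 3)          # one column per class
assert np.array_equal(scores.argmax(axis=1), clf.predict(X))
```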
On Tue, May
Well, if the naming's alright, I think PR#1962 should be a no-brainer. I
also added a deprecation warning in case SelectorMixin was used externally.
On Tue, May 14, 2013 at 7:00 AM, Lars Buitinck wrote:
> 2013/5/13 Joel Nothman :
> > How about SelectorMixin and _LearntSelectorMixin respectively
2013/5/13 Joel Nothman :
> How about SelectorMixin and _LearntSelectorMixin respectively?
SelectKBest and friends also learn.
> On Tue, May 14, 2013 at 6:54 AM, Joel Nothman
> wrote:
>>
>> That seems a happy solution to me! Should the new SelectorMixin be part of
>> the public API?
I haven't se
How about SelectorMixin and _LearntSelectorMixin respectively?
On Tue, May 14, 2013 at 6:54 AM, Joel Nothman
wrote:
> That seems a happy solution to me! Should the new SelectorMixin be part of
> the public API?
>
> (I'm not sure about Clf on _ClfSelectorMixin when it's mixed into
> regressors to
That seems a happy solution to me! Should the new SelectorMixin be part of
the public API?
(I'm not sure about Clf on _ClfSelectorMixin when it's mixed into
regressors too. Or aren't we so particular? Really what we mean is
_FeatureSelectorFromLearntModelParametersMixin, but terser.)
On Mon, May
On 05/13/2013 05:49 PM, Matthias Ekman wrote:
>> Unfortunately, this doesn't solve my problem. I'm still getting the same
>> error for an array with the dimension 2000x15. I'm very surprised as
>> this is not exactly a very large dataset. Are you sure there are no other
>> settings (maybe on a system level) that might interfere with this i
PR welcome on this. I think Jaques you have it ready.
Best,
Alex
On Tue, May 7, 2013 at 11:42 AM, Jaques Grobler wrote:
>
>
>>
>> 2013/5/7 James D Jensen
>> Thanks. You mentioned that I could "[add] positive to LassoCV and [pass]
>> it to the Lasso estimators used in the cross-val." In the dire
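A sketch of the suggestion being quoted: LassoCV accepts `positive=True` and passes it down to the Lasso fits used during cross-validation. The data here is synthetic, standing in for the poster's problem:

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.RandomState(0)
X = rng.randn(100, 4)
y = X.dot(np.array([2.0, 0.0, 1.0, 0.0])) + 0.1 * rng.randn(100)

# positive=True is forwarded to the Lasso estimators used in the
# cross-validation, constraining every coefficient to be >= 0.
reg = LassoCV(positive=True, cv=5).fit(X, y)
assert (reg.coef_ >= 0).all()
```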
In fact, the limit is a signed 32-bit int, so 2147483647, and you have 24
bytes.
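For reference, the figure quoted is the maximum value of a signed 32-bit integer:

```python
# The largest signed 32-bit int: 2**31 - 1.
assert 2**31 - 1 == 2147483647
```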
On Mon, May 13, 2013 at 8:01 AM, John Benediktsson wrote:
> Unfortunately, this doesn't solve my problem. I'm still getting the same
> error for an array with the dimension 2000x15. I'm very surprised as
> this is not exactly a very large dataset. Are you sure there are no other
> settings (maybe on a system level) that might interfere with this iss
On 05/13/2013 04:48 PM, Matthias Ekman wrote:
Hi Andy,
thanks for your comment. I didn't know about the pre_dispatch
parameter before. Here is my PR that adds the parameter to
``cross_val_score``
https://github.com/scikit-learn/scikit-learn/pull/1961
Unfortunately, this doesn't solve my prob
Hi Andy,
thanks for your comment. I didn't know about the pre_dispatch parameter
before. Here is my PR that adds the parameter to ``cross_val_score``
https://github.com/scikit-learn/scikit-learn/pull/1961
Unfortunately, this doesn't solve my problem. I'm still getting the same
error for an array
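A sketch of the `pre_dispatch` idea being discussed, using the modern import path (the 2013 module layout used `sklearn.cross_validation` instead):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# pre_dispatch caps how many jobs are queued up front when n_jobs > 1,
# which bounds how many copies of the data sit in memory at once.
X, y = load_iris(return_X_y=True)
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                         cv=5, n_jobs=1, pre_dispatch="2*n_jobs")
assert len(scores) == 5
```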
On Mon, May 13, 2013 at 03:07:12PM +0200, Lars Buitinck wrote:
> Since SelectorMixin is not exported by sklearn.feature_selection and
> is not documented anywhere, we can regard it as a private part of the
> API; if users have to read the source to find it, it's fair game. I
> suggest we rename tha
2013/5/13 Joel Nothman :
> There I name it FeatureSelectionMixin, but there is already a
> sklearn.feature_selection.selector_mixin.SelectorMixin for general
> estimators which also assign features importances (or they can be inferred
> from coefs_). This naming is potentially confusing, so I was w
Much of the functionality now in
sklearn.feature_selection.univariate_selection._BaseFilter [1] applies more
generally to all feature selection: extracting a column subset of some X
given some mask (transform), reversing that operation (inverse_transform)
and reporting the mask itself (get_support).
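The three operations described above can be sketched with a hypothetical mask-based selector (`MaskSelector` is an illustrative name, not scikit-learn code):

```python
import numpy as np

class MaskSelector:
    """Toy selector: transform, inverse_transform, get_support via a mask."""
    def __init__(self, mask):
        self.mask = np.asarray(mask, dtype=bool)

    def get_support(self):
        # report the boolean mask of selected features
        return self.mask

    def transform(self, X):
        # extract the column subset given the mask
        return np.asarray(X)[:, self.mask]

    def inverse_transform(self, Xt):
        # reverse the operation: selected columns back, zeros elsewhere
        Xt = np.asarray(Xt)
        X = np.zeros((Xt.shape[0], self.mask.size), dtype=Xt.dtype)
        X[:, self.mask] = Xt
        return X

sel = MaskSelector([True, False, True])
X = np.arange(6).reshape(2, 3)
Xt = sel.transform(X)   # keeps columns 0 and 2
assert np.array_equal(sel.inverse_transform(Xt)[:, sel.get_support()], Xt)
```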
Hi Ronnie,
As other people say, Theano won't be added as a dependency to the scikit.
However, the code is fairly simple and I guess that it would not be
difficult to make it work using Theano. If you do so, you may consider
doing a PR to Theano rather than to the scikit.
Alexandre.
On Sun, May 12,
Hi Evan,
On Wed, 2013-05-08 at 14:03 -0400, Evan Molinelli wrote:
> At the end of the day it seems to work and can retrieve the list of
> feature weights. HOWEVER, I am unable to find out the optimal value of
> alpha (the ridge parameter) that was used.
>
>
> RidgeCVInstance.coef_ just giv
Have you tried .alpha_?
Cheers,
- Joel
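A minimal sketch of the answer above, with synthetic data standing in for the protein / drug-synergy matrices:

```python
import numpy as np
from sklearn.linear_model import RidgeCV

rng = np.random.RandomState(0)
X = rng.randn(50, 5)
y = X.dot(np.array([1.0, 0.0, -2.0, 0.5, 0.0])) + 0.1 * rng.randn(50)

# RidgeCV's default mode is an efficient leave-one-out CV; the chosen
# penalty lands in .alpha_ and the fitted feature weights in .coef_.
reg = RidgeCV(alphas=[0.01, 0.1, 1.0, 10.0]).fit(X, y)

assert reg.alpha_ in (0.01, 0.1, 1.0, 10.0)
assert reg.coef_.shape == (5,)
```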
On Thu, May 9, 2013 at 4:03 AM, Evan Molinelli wrote:
> Hi all,
>
> I'm trying to perform a simple ridge regression of protein measurements on
> drug-synergy scores. I also want to use the ridge with the built in
> LeaveOneOut cross validation.
>
> At th
Hi all,
I'm trying to perform a simple ridge regression of protein measurements on
drug-synergy scores. I also want to use the ridge with the built in
LeaveOneOut cross validation.
At the end of the day it seems to work and can retrieve the list of feature
weights. HOWEVER, I am unable to find o
That should be 'Github's "Merge pull request" button'. Sleepy.
On Mon, May 13, 2013 at 6:00 PM, Joel Nothman
wrote:
> It's not quite that involved.
>
> Firstly, I've factored out the squash-all part into a separate `git
> squash` command (http://github.com/jnothman/git-squash), so the rest can
>
It's not quite that involved.
Firstly, I've factored out the squash-all part into a separate `git squash`
command (http://github.com/jnothman/git-squash), so the rest can be done by
hand and this used when it's appropriate.
Secondly, I've discovered it can be simplified since git 1.7.8 which
intr
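A hedged sketch of the squash-all step, NOT the jnothman/git-squash script itself: one common way to collapse every commit on a branch into a single commit with `git reset --soft`, demonstrated in a throwaway repository (the identity and branch names are placeholders):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email dev@example.com   # placeholder identity
git config user.name dev
echo one > f && git add f && git commit -qm "base"
base=$(git rev-parse HEAD)
git checkout -qb feature
echo two >> f && git commit -qam "wip 1"
echo three >> f && git commit -qam "wip 2"
# Squash: keep the working tree as-is, replace the two wip commits with one.
git reset --soft "$base"
git commit -qm "feature, squashed"
git rev-list --count "$base"..HEAD   # prints 1
```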
Hi Joel,
Moving this on the list.
Thanks a lot for the script. It's more involved than I would have naively
thought :).
In general, I am not for squashing everything together when merging in a
PR. I believe that commits that fit together should be merged together,
however, I also believe that we
22 matches