Hi guys,
Thanks a lot for all your interesting replies.
i) How can I get the threshold value that the classifier used to decide whether a particular sample belongs to class 0 or class 1 in binary classification with scikit-learn? That was the whole purpose of my previous questions.
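For context: in scikit-learn the binary cutoff is implicit rather than stored as a fitted attribute; predict() thresholds decision_function() at 0 (or predict_proba() at 0.5). A minimal sketch, assuming a toy dataset and a linear SVC (all names below are illustrative):

import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Toy data with 16 features, matching the original question.
X, y = make_classification(n_samples=200, n_features=16, random_state=0)
clf = SVC(kernel="linear").fit(X, y)

scores = clf.decision_function(X)  # signed distance to the hyperplane
manual = (scores > 0).astype(int)  # apply the implicit 0 cutoff by hand
assert np.array_equal(manual, clf.predict(X))

To use a different cutoff, threshold the scores yourself instead of calling predict().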
On 22/02/15 22:50, Ronnie Ghose wrote:
@Molden, who were your supervisors?
http://www.nobelprize.org/nobel_prizes/medicine/laureates/2014/press.html
On Sat, Feb 21, 2015 at 5:57 PM, Sturla Molden sturla.mol...@gmail.com wrote:
Yeah, it's here.
Thanks for the explanation.
S.
On Sun, Feb 22, 2015 at 10:57 PM, Andy t3k...@gmail.com wrote:
I think it is. The last character of my answer was _.
That is the variable name to which you assign the thresholds in your code.
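Andy's message is cut off, so this is only a guess at what he refers to (an assumption, not confirmed by the thread): the thresholds array returned as the third output of sklearn.metrics.roc_curve, which you assign to a variable in your code:

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve

X, y = make_classification(n_samples=200, random_state=0)
scores = LogisticRegression().fit(X, y).decision_function(X)

# roc_curve's third output holds the candidate cutoffs, one per distinct
# score value, ordered from highest to lowest.
fpr, tpr, thresholds = roc_curve(y, scores)
print(thresholds[:5])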
Hi Andy,
Thanks for the reply. I guess your answer to the second question is not completely displayed in the email. Could you please re-post it?
Thanks!
S.
Nice one! I gather that you were working for the Mosers.
G
On 21/02/15 23:20, Sturla Molden wrote:
I have discovered a truly marvelous proof, which this margin is too
narrow to contain. ;-)
A more bizarre story...
Last time I said something like that was in 2004, when a postdoc, a
fellow PhD student and I had found some strange hexagonal patterns in
On 20/02/15 18:34, Gael Varoquaux wrote:
On Fri, Feb 20, 2015 at 05:27:12PM +0100, shalu jhanwar wrote:
i) Can I do it with more features (I have 16 features)?
How do you visualize a 16-features space?
I think I know of a rather general solution to this problem. But I don't
think I should
On 21/02/15 22:58, Vlad Niculae wrote:
Apologies in advance, but this fits so well, I couldn’t help myself.
A Mathematician and an Engineer attend a lecture by a Physicist. The topic
concerns Kaluza-Klein theories involving physical processes that occur in spaces
with dimensions of 9, 12 and even higher. The Mathematician is sitting,
On 20/02/15 14:29, shalu jhanwar wrote:
Hi guys,
I am using the SVM and random forest classifiers from scikit-learn. I wonder whether it is possible to plot the decision boundary of the model on my own training dataset, so that I can get a feel for the data. Is there any built-in example available in scikit-learn that I can refer to, to view, let's say
Hi Sebastian,
Thanks a lot for your reply. In these examples, only 2 features are used to generate the plots.
i) Can I do it with more features (I have 16 features)?
ii) I want to see the decision boundary of my training and testing dataset to see whether the model is fine or overfitted on
Hi Shalu,
One example for plotting decision regions would be here:
http://scikit-learn.org/stable/auto_examples/plot_classifier_comparison.html
It's basically a brute-force approach: you define a 2D grid of points and then classify each of those points. Also, the downside is that you can only
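A minimal sketch of the brute-force grid idea Sebastian describes, assuming synthetic 2D data (everything below is illustrative, not from the thread):

import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Two features, so the grid can be drawn directly.
X, y = make_classification(n_samples=200, n_features=2, n_redundant=0,
                           random_state=0)
clf = SVC().fit(X, y)

# Dense grid spanning the data, with a small margin around it.
xx, yy = np.meshgrid(
    np.linspace(X[:, 0].min() - 1, X[:, 0].max() + 1, 300),
    np.linspace(X[:, 1].min() - 1, X[:, 1].max() + 1, 300))
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

plt.contourf(xx, yy, Z, alpha=0.3)                   # predicted regions
plt.scatter(X[:, 0], X[:, 1], c=y, edgecolors="k")   # training points
plt.show()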
On Fri, Feb 20, 2015 at 05:27:12PM +0100, shalu jhanwar wrote:
i) Can I do it with more features (I have 16 features)?
How do you visualize a 16-feature space?
G
Generally I do PCA and plot the data in the reduced dimensions (PC1 and PC2). Here I'm interested in knowing the decision boundary of the classifier.
S.
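One way to combine the PCA projection with the grid trick above; this sketch assumes it is acceptable to refit the classifier on PC1/PC2, so the plotted boundary belongs to the 2D refit, not to the original 16-feature model:

import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=300, n_features=16, random_state=0)
X2 = PCA(n_components=2).fit_transform(X)                # keep PC1 and PC2 only
clf = RandomForestClassifier(random_state=0).fit(X2, y)  # refit in 2D

xx, yy = np.meshgrid(
    np.linspace(X2[:, 0].min() - 1, X2[:, 0].max() + 1, 200),
    np.linspace(X2[:, 1].min() - 1, X2[:, 1].max() + 1, 200))
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

plt.contourf(xx, yy, Z, alpha=0.3)
plt.scatter(X2[:, 0], X2[:, 1], c=y, edgecolors="k")
plt.xlabel("PC1")
plt.ylabel("PC2")
plt.show()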
On Fri, Feb 20, 2015 at 6:34 PM, Gael Varoquaux gael.varoqu...@normalesup.org wrote:
i) That would be quite a challenge for the human brain: in the best case you have a hyperplane in 16 dimensions :). How could you put that into a scatter plot!? :)
ii + iii) If I understand correctly, you want to get an idea of the generalization error? The simplest way would maybe be to look
iii) What would be the best way to know whether the model is fine or overfitted, in your experience?
Take a look at this answer by Lars: http://stackoverflow.com/a/12254521/4016687
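In the spirit of that answer, a minimal sketch of the usual check, assuming a held-out split: a large gap between the training and test scores is the classic sign of overfitting.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split, cross_val_score

X, y = make_classification(n_samples=300, n_features=16, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print("train:", clf.score(X_tr, y_tr))  # typically near 1.0 for a forest
print("test: ", clf.score(X_te, y_te))  # the honest estimate
print("cv:   ", cross_val_score(clf, X, y, cv=5).mean())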