Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-22 Thread shalu jhanwar
Hi guys, thanks a lot for all your interesting replies. i) How can I get the threshold value that the classifier used to decide whether a particular sample belongs to class 0 or class 1 in binary classification with scikit-learn? The whole purpose of my previous questions was to know about that
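A minimal sketch of what "threshold" means for scikit-learn's built-in predict() in this binary setting, using made-up example data (the thread's own data and models are not shown): SVC compares decision_function() against 0, and RandomForestClassifier effectively compares the predicted probability of the positive class against 0.5.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.svm import SVC
    from sklearn.ensemble import RandomForestClassifier

    # Synthetic stand-in for the 16-feature data discussed in the thread.
    X, y = make_classification(n_samples=200, n_features=16, random_state=0)

    svm = SVC(kernel="rbf").fit(X, y)
    scores = svm.decision_function(X)      # signed distance to the separating surface
    svm_labels = (scores > 0).astype(int)  # predict() applies a threshold of 0 here

    rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
    proba = rf.predict_proba(X)[:, 1]      # estimated probability of class 1
    rf_labels = (proba > 0.5).astype(int)  # predict() is effectively a 0.5 cutoff (ties aside)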

Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-22 Thread Sturla Molden
On 22/02/15 22:50, Ronnie Ghose wrote: @Molden who were your supervisors? http://www.nobelprize.org/nobel_prizes/medicine/laureates/2014/press.html On Sat, Feb 21, 2015 at 5:57 PM, Sturla Molden sturla.mol...@gmail.com wrote: On 21/02/15 23:20,

Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-22 Thread shalu jhanwar
Yeah, it's here. Thanks for the explanation. S. On Sun, Feb 22, 2015 at 10:57 PM, Andy t3k...@gmail.com wrote: I think it is. The last character of my answer was _. That is the variable name to which you assign the thresholds in your code. On 02/22/2015 01:54 PM, shalu jhanwar wrote:

Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-22 Thread Andy
On 02/22/2015 01:21 PM, shalu jhanwar wrote: Hi guys, thanks a lot for all your interesting replies. i) How can I get the threshold value that the classifier used to decide whether a particular sample belongs to class 0 or class 1 in binary classification with scikit-learn? The whole purpose

Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-22 Thread shalu jhanwar
Hi Andy, thanks for the reply. I guess your answer to the second question is not completely displayed in the email. Could you please re-post it? Thanks! S. On Sun, Feb 22, 2015 at 10:46 PM, Andy t3k...@gmail.com wrote: On 02/22/2015 01:21 PM, shalu jhanwar wrote: Hi guys, thanks a lot for all

Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-22 Thread Andy
I think it is. The last character of my answer was _. That is the variable name to which you assign the thresholds in your code. On 02/22/2015 01:54 PM, shalu jhanwar wrote: Hi Andy, thanks for the reply. I guess your second question's ans. is not completely displayed in the email. Could you
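The code Andy is referring to is not shown in the archive; one common pattern that matches his description of a variable ending in "_" holding thresholds (an assumption here, not the original code) is collecting the candidate thresholds returned by roc_curve:

    from sklearn.datasets import make_classification
    from sklearn.metrics import roc_curve
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=200, n_features=16, random_state=0)
    clf = SVC(kernel="rbf").fit(X, y)
    scores = clf.decision_function(X)

    # roc_curve returns one candidate threshold per operating point on the ROC curve;
    # thresholds_ is the hypothetical variable name discussed above.
    fpr, tpr, thresholds_ = roc_curve(y, scores)
    print(thresholds_[:5])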

Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-22 Thread Gael Varoquaux
Nice one! I gather that you were working for the Mosers. G On Sat, Feb 21, 2015 at 11:57:27PM +0100, Sturla Molden wrote: On 21/02/15 23:20, Sturla Molden wrote: I have discovered a truly marvelous proof, which this margin is too narrow to contain. ;-) A more bizarre story... Last time

Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-21 Thread Sturla Molden
On 21/02/15 23:20, Sturla Molden wrote: I have discovered a truly marvelous proof, which this margin is too narrow to contain. ;-) A more bizarre story... Last time I said something like that was in 2004, when a postdoc, a fellow PhD student and I had found some strange hexagonal patterns in

Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-21 Thread Sturla Molden
On 20/02/15 18:34, Gael Varoquaux wrote: On Fri, Feb 20, 2015 at 05:27:12PM +0100, shalu jhanwar wrote: i) Can I do it with more features (I have 16 features)? How do you visualize a 16-feature space? I think I know of a rather general solution to this problem. But I don't think I should

Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-21 Thread Sturla Molden
I have discovered a truly marvelous proof, which this margin is too narrow to contain. ;-) Sturla On 21/02/15 22:58, Vlad Niculae wrote: Apologies in advance, but this fits so well, I couldn’t help myself. A Mathematician and an Engineer attend a lecture by a Physicist. The topic concerns

Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-21 Thread Vlad Niculae
Apologies in advance, but this fits so well, I couldn’t help myself. A Mathematician and an Engineer attend a lecture by a Physicist. The topic concerns Kaluza-Klein theories involving physical processes that occur in spaces with dimensions of 9, 12 and even higher. The Mathematician is sitting,

Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-21 Thread Sturla Molden
On 20/02/15 14:29, shalu jhanwar wrote: Hi guys, I am using the SVM and Random Forest classifiers from scikit-learn. I wonder whether it is possible to plot the decision boundary of the model on my own training dataset so that I can get a feeling for the data? Is there any built-in example available in

[Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-20 Thread shalu jhanwar
Hi guys, I am using the SVM and Random Forest classifiers from scikit-learn. I wonder whether it is possible to plot the decision boundary of the model on my own training dataset so that I can get a feeling for the data? Is there any built-in example available in scikit-learn that I can refer to, to view, let's say

Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-20 Thread shalu jhanwar
Hi Sebastian, thanks a lot for your reply. In these examples, only 2 features are used to generate the plots. i) Can I do it with more features (I have 16 features)? ii) I want to see the decision boundary of my training and testing dataset to see whether the model is fine or it is overfitted on

Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-20 Thread Sebastian Raschka
Hi, Shalu, one example for plotting decision regions would be here: http://scikit-learn.org/stable/auto_examples/plot_classifier_comparison.html It's basically a brute-force approach: you define a 2D grid of points and then classify each of those points. Also, the downside is that you can only
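A short sketch of that brute-force approach, loosely following the linked plot_classifier_comparison example (the classifier and data here are placeholders): classify every point of a dense 2D grid and colour the plane by the predicted class.

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.datasets import make_classification
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=200, n_features=2, n_redundant=0, random_state=0)
    clf = SVC(kernel="rbf", gamma=2).fit(X, y)

    # Dense grid covering the data, with a small margin around it.
    x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
    y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
    xx, yy = np.meshgrid(np.linspace(x_min, x_max, 300),
                         np.linspace(y_min, y_max, 300))

    # Classify every grid point and reshape the predictions back onto the grid.
    Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

    plt.contourf(xx, yy, Z, alpha=0.3)                  # decision regions
    plt.scatter(X[:, 0], X[:, 1], c=y, edgecolors="k")  # training points
    plt.show()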

Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-20 Thread Gael Varoquaux
On Fri, Feb 20, 2015 at 05:27:12PM +0100, shalu jhanwar wrote: i) Can I do it with more features (I have 16 features)? How do you visualize a 16-feature space? G

Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-20 Thread shalu jhanwar
Generally I do PCA and can plot the reduced dimensions of the data (PC1 and PC2). Here I'm interested in knowing the decision boundary of the classifier. S. On Fri, Feb 20, 2015 at 6:34 PM, Gael Varoquaux gael.varoqu...@normalesup.org wrote: On Fri, Feb 20, 2015 at 05:27:12PM +0100, shalu
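One way to combine the two ideas in this exchange (a sketch under the assumption that re-fitting in PC space is acceptable): project the 16 features onto PC1/PC2 with PCA, fit a classifier on the projected data, and draw its boundary there. Note that this shows the boundary of the model trained in PC space, not of the original 16-feature model.

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.datasets import make_classification
    from sklearn.decomposition import PCA
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=300, n_features=16, n_informative=4, random_state=0)
    X2 = PCA(n_components=2).fit_transform(X)   # keep PC1 and PC2 only
    clf = SVC(kernel="rbf").fit(X2, y)          # classifier trained in PC space

    xx, yy = np.meshgrid(np.linspace(X2[:, 0].min() - 1, X2[:, 0].max() + 1, 300),
                         np.linspace(X2[:, 1].min() - 1, X2[:, 1].max() + 1, 300))
    Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

    plt.contourf(xx, yy, Z, alpha=0.3)
    plt.scatter(X2[:, 0], X2[:, 1], c=y, edgecolors="k")
    plt.xlabel("PC1")
    plt.ylabel("PC2")
    plt.show()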

Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-20 Thread Sebastian Raschka
i) That would be quite a challenge for the human brain: in the best case you have a hyperplane in 16 dimensions :). How can we put that into a scatter plot!? :) ii + iii) If I understand correctly, you want to get an idea about the generalization error? The simplest way would maybe be to look
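Sebastian's suggestion is cut off in the archive; a minimal, commonly used check along these lines (an assumption, not necessarily his exact suggestion) is to compare the training score with a held-out test score, where a large gap hints at overfitting. Current scikit-learn releases expose train_test_split in sklearn.model_selection; 2015-era releases used sklearn.cross_validation.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=500, n_features=16, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
    print("train accuracy:", clf.score(X_train, y_train))  # often close to 1.0 for a forest
    print("test accuracy: ", clf.score(X_test, y_test))    # the more honest number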

Re: [Scikit-learn-general] Regarding viewing the decision boundaries of classifiers

2015-02-20 Thread ragv ragv
iii) What would be the best way to know whether the model is fine or overfitted, in your experience? Take a look at this answer by Lars: http://stackoverflow.com/a/12254521/4016687
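The linked Stack Overflow answer is not quoted here; as a related, hedged illustration, cross-validated scores give a less optimistic estimate of performance than the score on the training data (module path per current scikit-learn; older releases used sklearn.cross_validation).

    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=500, n_features=16, random_state=0)
    clf = SVC(kernel="rbf", C=1.0)

    # 5-fold cross-validation: each fold is scored on data the model did not see.
    scores = cross_val_score(clf, X, y, cv=5)
    print("CV accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))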