On 21/02/15 23:20, Sturla Molden wrote:
I have discovered a truly marvelous proof, which this margin is too
narrow to contain. ;-)
A more bizarre story...
Last time I said something like that was in 2004, when a postdoc, a
fellow PhD student and I had found some strange hexagonal patterns in
On 20/02/15 18:34, Gael Varoquaux wrote:
On Fri, Feb 20, 2015 at 05:27:12PM +0100, shalu jhanwar wrote:
i) Can I do it with more features (I have 16 features)?
How do you visualize a 16-feature space?
I think I know of a rather general solution to this problem. But I don't
think I should
I have discovered a truly marvelous proof, which this margin is too
narrow to contain. ;-)
Sturla
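One common workaround for the 16-feature question, suggested here as a sketch and not something proposed in the thread: project the data to two dimensions first (e.g. with PCA), fit in that plane, and draw the boundary there. The picture is then only an approximation of the true 16-dimensional boundary.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for a 16-feature dataset.
X, y = make_classification(n_samples=300, n_features=16,
                           n_informative=4, random_state=0)

# Project to 2-D so the boundary can be drawn, then fit in that plane.
X2 = PCA(n_components=2).fit_transform(X)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X2, y)

# Evaluate the classifier on a grid covering the projected data.
xx, yy = np.meshgrid(np.linspace(X2[:, 0].min(), X2[:, 0].max(), 100),
                     np.linspace(X2[:, 1].min(), X2[:, 1].max(), 100))
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)
# Z can be shaded with matplotlib's contourf and the points overlaid
# with scatter, as in scikit-learn's decision-surface examples.
```

Note the caveat: a forest fit on the projected features is not the same model as one fit on all 16, so this visualizes a simplified surrogate.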
On 21/02/15 22:58, Vlad Niculae wrote:
Apologies in advance, but this fits so well, I couldn’t help myself.
A Mathematician and an Engineer attend a lecture by a Physicist. The topic
concerns
One way to encourage people to use the scorer API more would be to add a
more direct interface like:
from sklearn.metrics import get_scorer

def score(scoring, estimator, X, y=None, **kwargs):
    return get_scorer(scoring)(estimator, X, y, **kwargs)
On 20 February 2015 at 20:58, Mathieu Blondel math...@mblondel.org wrote:
On
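To make the proposal concrete, here is a self-contained sketch of how that helper would be called, repeating the two-line definition and assuming it simply wraps sklearn.metrics.get_scorer as in the snippet above:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import get_scorer

# The proposed one-call interface, as sketched in the thread.
def score(scoring, estimator, X, y=None, **kwargs):
    return get_scorer(scoring)(estimator, X, y, **kwargs)

X, y = make_classification(random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X, y)

# One call instead of fetching the scorer object first:
acc = score("accuracy", clf, X, y)
```

The appeal is that users pass a scoring string directly, the same way they would to cross_val_score or GridSearchCV, without touching the scorer object.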
Apologies in advance, but this fits so well, I couldn’t help myself.
A Mathematician and an Engineer attend a lecture by a Physicist. The topic
concerns Kaluza-Klein theories involving physical processes that occur in spaces
with dimensions of 9, 12 and even higher. The Mathematician is sitting,
On 20/02/15 14:29, shalu jhanwar wrote:
Hi guys,
I am using the SVM and Random Forest classifiers from scikit-learn. I wonder
whether it is possible to plot the decision boundary of the model on my own
training dataset, so that I can get a feel for the data. Is there any
built-in example available in
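For the two-feature case the standard recipe (in the style of scikit-learn's own decision-surface gallery examples, not code from this thread) is to evaluate the fitted classifier on a dense grid and contour the predictions:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Two informative features so the boundary can be drawn in the plane.
X, y = make_classification(n_samples=200, n_features=2,
                           n_informative=2, n_redundant=0,
                           random_state=0)
clf = SVC(kernel="rbf").fit(X, y)

# Evaluate the model on a grid covering the training data.
xx, yy = np.meshgrid(
    np.linspace(X[:, 0].min() - 1, X[:, 0].max() + 1, 200),
    np.linspace(X[:, 1].min() - 1, X[:, 1].max() + 1, 200))
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

# Z can now be shaded with matplotlib, e.g.:
# import matplotlib.pyplot as plt
# plt.contourf(xx, yy, Z, alpha=0.3)
# plt.scatter(X[:, 0], X[:, 1], c=y)
```

The same grid-and-predict pattern works for any classifier with a predict method; only the two-feature restriction is fundamental, which is exactly what the rest of the thread is joking about.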