On Fri, Nov 4, 2011 at 4:54 PM, Andreas Müller <[email protected]> wrote:
> On 11/04/2011 03:49 PM, Andreas Müller wrote:
>> On 11/04/2011 03:42 PM, Alexandre Passos wrote:
>>> On Fri, Nov 4, 2011 at 10:34, Lars Buitinck <[email protected]> wrote:
>>>> 2011/11/4 Alexandre Passos <[email protected]>:
>>>>> I have a question: why not just use Theano for this? I doubt that we
>>>>> can write neural network code that's as fast as their automatically
>>>>> generated code.
>>>> Would that mean an extra run-time dependency?
>>> Yes, since Theano needs a compiler (gcc, or nvcc if you want to use CUDA)
>>> available at run time, but even so it's faster than most hand-coded
>>> implementations of neural networks. James Bergstra reads this list
>>> occasionally, and he's one of the main people behind Theano, so he can
>>> give more info here.
>>>
>>
> As an afterthought: you could use the same argument for SGD,
> kmeans and many other algorithms inside sklearn.
> Do you want all of them to be replaced by Theano implementations?
This is a great point. Also, I'd like to cite Gael's recent remark that "machine learning should be a commodity": if we could have simpler code with fewer dependencies that does just well enough, a niche would be filled, in my opinion.

Vlad
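For readers not familiar with the run-time compilation model mentioned above, here is a minimal sketch (illustrative only; the variable names and shapes are not from this thread) of how Theano compiles a symbolic expression into native code when theano.function is called, which is why a C compiler must be present at run time:

    # Minimal sketch, assuming Theano is installed. Theano builds a symbolic
    # graph and compiles it to native code when theano.function() is called,
    # hence the need for gcc (or nvcc for CUDA) at run time.
    import numpy as np
    import theano
    import theano.tensor as T

    # Symbolic forward pass in the style of logistic regression: sigmoid(x.W + b)
    x = T.matrix('x')
    W = theano.shared(np.zeros((5, 3)), name='W')
    b = theano.shared(np.zeros(3), name='b')
    out = T.nnet.sigmoid(T.dot(x, W) + b)

    # Graph optimization and C code generation happen here, not at call time.
    forward = theano.function([x], out)

    print(forward(np.random.rand(4, 5)))

The generated code is specialized to the graph, which is where the speed claim above comes from; the trade-off being debated is exactly the gcc/nvcc run-time dependency it introduces.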
