I think for neural nets there is a real benefit, but in general ML I
haven't seen much improvement.
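
To illustrate the compatibility point raised below: because device memory
is not in the CPU's virtual address space, a GPU array type can only hold a
handle and copy explicitly. A minimal sketch (the `DeviceArray` class is
hypothetical, with a NumPy buffer standing in for device memory):

```python
import numpy as np

class DeviceArray:
    """Hypothetical GPU array: holds a device handle, not subclassable
    from np.ndarray because its buffer lives outside CPU virtual memory."""

    def __init__(self, host_data):
        # In a real CUDA/OpenCL library this would be a host->device
        # transfer over the bus; here an ndarray stands in for device memory.
        self._device_buf = np.array(host_data)

    def to_host(self):
        # Explicit device->host copy: this is the "move cost".
        return np.array(self._device_buf)

a = DeviceArray([1.0, 2.0, 3.0])

# NumPy functions don't recognize a DeviceArray as an ndarray, so any
# NumPy-based code has to be rewritten to be GPU-aware:
print(isinstance(a, np.ndarray))   # False
print(np.sum(a.to_host()))         # 6.0, but only after an explicit copy
```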
On May 7, 2014 12:11 PM, "Lars Buitinck" <larsm...@gmail.com> wrote:
> 2014-05-07 9:41 GMT+02:00 Matthieu Brucher <matthieu.bruc...@gmail.com>:
> > IMHO, GPUs will be usable when CPU and GPU memories are integrated
> > without transfer cost. Until then, GPUs will be hype without mainstream
> > usage.
>
> My thought exactly. Without mapping device memory into the CPU's
> virtual memory, no proper subclass of np.ndarray is possible, every
> "GPU NumPy" needs to sacrifice compatibility for performance, and
> every package that uses it needs to be GPU-aware. (Or, in the present
> state of affairs, either CUDA or OpenCL-aware...)
>
>
> ------------------------------------------------------------------------------
> Is your legacy SCM system holding you back? Join Perforce May 7 to find
> out:
> • 3 signs your SCM is hindering your productivity
> • Requirements for releasing software faster
> • Expert tips and advice for migrating your SCM now
> http://p.sf.net/sfu/perforce
> _______________________________________________
> Scikit-learn-general mailing list
> Scikit-learn-general@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
>