2014-05-07 9:41 GMT+02:00 Matthieu Brucher <matthieu.bruc...@gmail.com>:
> IMHO, GPUs will only be usable when CPU and GPU memory are integrated
> without transfer cost. Until then, GPUs will remain hype without mainstream usage.

My thought exactly. Without mapping device memory into the CPU's
virtual memory, no proper subclass of np.ndarray is possible; every
"GPU NumPy" has to sacrifice compatibility for performance, and
every package that uses it needs to be GPU-aware. (Or, in the present
state of affairs, either CUDA- or OpenCL-aware...)
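
For concreteness, a minimal sketch of the gap I mean, using PyCUDA's
gpuarray purely as an illustration (the same point holds for any OpenCL
equivalent):

    import numpy as np
    import pycuda.autoinit              # creates a CUDA context
    import pycuda.gpuarray as gpuarray

    x_cpu = np.arange(1000000, dtype=np.float64)

    # Explicit host -> device copy: this is the "move cost" in question.
    x_gpu = gpuarray.to_gpu(x_cpu)

    # GPUArray is not a subclass of np.ndarray, so library code that does
    # isinstance checks or touches the NumPy C API cannot accept it.
    print(isinstance(x_gpu, np.ndarray))    # False

    # Getting the result back requires another explicit copy.
    x_back = x_gpu.get()
    print(np.allclose(x_cpu, x_back))       # True

Until device memory can be mapped transparently, every such array type
forces this explicit round trip on its callers.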
