Nathaniel and Ian both mentioned adding code to Numpy to explicitly support
Eigen.

It seems to me that a potentially better route than adding code to Numpy for
each particular BLAS library is to make Numpy easy to configure and compile
against an arbitrary BLAS library (like what I've been doing).

Reasoning behind this:
* numpy.distutils.system_info is already 2300 lines of code. This code (and
the full process of linking against BLAS) is not very well documented or
flexible, as evidenced by the trouble I had linking against a custom BLAS and
the fact that changes to Numpy's source were required. In my opinion, adding
more code to this makes things worse rather than better.

* I think the "support an arbitrary BLAS library" option sits at a better
point on the "ease of use" vs. "robustness" plane than adding support for
particular libraries one at a time.

* BLAS is an interface. Shouldn't we (as Numpy developers) take full
advantage of the interface and the generality it allows?

As I mentioned in my first email, the current user experience of linking
Numpy to an arbitrary BLAS consists of (0) compile/acquire the BLAS binary,
(1) modify the Numpy source, (2) compile CBLAS against your BLAS, and (3)
modify site.cfg and build. Step 0 is outside the scope of Numpy, and none of
steps 1-3 is very well documented. I think the ideal user experience would
consist solely of steps 0 and 3: acquire a BLAS, point Numpy at it via
site.cfg, and build. It would be an even bigger improvement if the user could
change the BLAS for an existing Numpy install without rebuilding (i.e., Numpy
loads some site.cfg equivalent each time it is imported and dynamically links
based on that, or something clever is done with symlinks). I'm not sure
whether the dynamic-loading variant is feasible, but I believe the symlink
route is (see Ubuntu's update-alternatives program).
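
For concreteness, here is a sketch of what the user-facing configuration
could look like (the section and key names follow the existing
site.cfg.example; the library name and paths are hypothetical):

    [blas]
    libraries = eigen_blas
    library_dirs = /opt/eigen-blas/lib
    include_dirs = /opt/eigen-blas/include
    runtime_library_dirs = /opt/eigen-blas/lib

    [lapack]
    libraries = lapack
    library_dirs = /usr/lib

Ideally, dropping a file like this next to setup.py and then building would
be the entire procedure.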

I realize this would require a large rewrite of numpy.distutils, but this
approach seems far simpler for the end user. The build code could do
something like: did the user specify a BLAS? -> is ATLAS installed? -> is
OpenBLAS installed? -> fall back to some default.
The same approach could be applied to LAPACK and FFT libraries (though FFT
might be a bit trickier since the interface is less standard).
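
As a rough sketch of that selection logic (the probe function below is a
stand-in using ctypes, not an existing numpy.distutils API; a real
implementation would search the configured library_dirs instead):

    import ctypes.util

    def probe_blas(name):
        # Stand-in for "is this BLAS installed, and where?". Here we just
        # ask the dynamic linker; numpy.distutils would search library_dirs.
        found = ctypes.util.find_library(name)
        return {"name": name, "library": found} if found else None

    def choose_blas(user_config=None):
        # Did the user specify a BLAS (e.g. via site.cfg)? Then trust it.
        if user_config is not None:
            return user_config
        # Otherwise probe known implementations in order of preference.
        for candidate in ("atlas", "openblas"):
            info = probe_blas(candidate)
            if info is not None:
                return info
        # Fall back to the reference implementation.
        return probe_blas("blas")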

Thoughts on this general approach?

-Eric


On Mon, Jul 13, 2015 at 10:32 AM, Ian Henriksen <
insertinterestingnameh...@gmail.com> wrote:

> On Mon, Jul 13, 2015 at 7:55 AM Nathaniel Smith <n...@pobox.com> wrote:
>
>> On Jul 13, 2015 1:44 AM, "Eric Martin" <e...@ericmart.in> wrote:
>> >
>> > My procedure questions:
>> > Is the installation procedure I outlined above reasonable, or does it
>> contain steps that could/should be removed? Having to edit Numpy source
>> seems sketchy to me. I largely came up with this procedure by looking up
>> tutorials online and by trial and error. I don't want to write
>> documentation that encourages people to do something in a non-optimal way,
>> so if there is a better way to do this, please let me know.
>>
>> I'll leave others with more knowledge to answer your other questions, but
>> if you're interested in making it easy for others to link numpy against
>> these libraries, I'd suggest modifying the numpy source further, by
>> submitting a patch that teaches numpy.distutils how to detect and link
>> against these libraries automatically :-). There are several libraries we
>> already know how to do this for that you can compare against for reference,
>> and this will also help for other libraries that use blas and
>> numpy.distutils, like scipy.
>>
> Supporting Eigen sounds like a great idea. BLIS would be another
> one worth supporting at some point. As far as implementation goes, it may
> be helpful to look at https://github.com/numpy/numpy/pull/3642 and
> https://github.com/numpy/numpy/pull/4191 for the corresponding set of
> changes for OpenBLAS. That could be a good starting point.
> Thanks for bringing this up!
> -Ian Henriksen
>
>> > Eigen has excellent performance. On my i5-5200U (Broadwell) CPU, I
>> found Eigen BLAS compiled with AVX and FMA instructions to take 3.93s to
>> multiply 2 4000x4000 double matrices with a single thread, while my install
>> of Numpy from Ubuntu took 9s (and used 4 threads on my 2 cores). My Ubuntu
>> numpy appears to be built against "libblas", which I think is the reference
>> implementation.
>>
>> If you're using the numpy packages distributed by Ubuntu, then it should
>> be possible to switch to openblas just by apt installing openblas and then
>> maybe fiddling with update-alternatives.
>>
>> -n
_______________________________________________
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion
