Re: [Numpy-discussion] Move scipy.org docs to Github?

2017-03-15 Thread Matthew Brett
Hi,

On Wed, Mar 15, 2017 at 4:20 PM, Pauli Virtanen  wrote:
> Wed, 15 Mar 2017 14:56:52 -0700, Nathaniel Smith wrote:
> [clip]
>> As long as we can fit under the 1 gig size limit then GH pages seems
>> like the best option so far... it's reliable, widely understood, and all
>> of the limits besides the 1 gig size are soft limits where they say
>> they'll work with us to figure things out.
>
> The Scipy html docs weigh 60M apiece and numpy is 35M, so it can be done
> if a limited number of releases are kept, and the rest and the auxiliary
> files are put as release downloads.
>
> Otherwise there's probably no problem, as you can stick a CDN in front
> if it's too heavy.
>
> Sounds sensible? Certainly it's the lowest-effort approach, and would
> simplify management of S3/etc origin site access permissions.

Sounds very sensible to me,

Cheers,

Matthew
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Move scipy.org docs to Github?

2017-03-15 Thread Matthew Brett
Hi,

On Wed, Mar 15, 2017 at 2:47 AM, Ralf Gommers <ralf.gomm...@gmail.com> wrote:
>
>
> On Wed, Mar 15, 2017 at 3:21 PM, Matthew Brett <matthew.br...@gmail.com>
> wrote:
>>
>> Hi,
>>
>> The scipy.org site is down at the moment, and has been for more than 36
>> hours:
>>
>> https://github.com/numpy/numpy/issues/8779#issue-213781439
>>
>> This has happened before:
>>
>> https://github.com/scipy/scipy.org/issues/187#issue-186426408
>>
>> I think it was down for about 24 hours that time.
>>
>> From the number of people opening issues or commenting on the
>> scipy.org website this time, it seems to be causing quite a bit of
>> disruption.
>>
>> It seems to me that we would have a much better chance of avoiding
>> significant down-time if we switched to hosting the docs on github
>> pages.
>>
>> What do y'all think?
>
>
> Once the site is back up we should look at migrating to a better (hosted)
> infrastructure. I suspect that Github Pages won't work, we'll exceed or be
> close to exceeding both the 1 GB site size limit and the 100 GB/month
> bandwidth limit [1].
>
> Rough bandwidth estimate (using page size from
> http://scipy.github.io/devdocs/ and Alexa stats): 2 million visits per
> month, 2.5 page views per visit, 5 kb/page = 25 GB/month (html). Add to that
> pdf docs, which are ~20 MB in size: if only a small fraction of visitors
> download those, we'll be at >100 GB.

Maybe we could host the PDF docs somewhere else?   I wonder if Github
would consider allowing us to go a bit over if necessary?
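For what it's worth, the html-only figure above checks out; a quick back-of-the-envelope sketch, using the rough estimates quoted in the email (not measured values):

```python
# Back-of-the-envelope check of the bandwidth estimate quoted above.
# All inputs are the rough figures from the email, not measurements.
visits_per_month = 2_000_000
pages_per_visit = 2.5
kb_per_page = 5  # html only, excludes pdf downloads

gb_per_month = visits_per_month * pages_per_visit * kb_per_page / 1_000_000
print(gb_per_month)  # 25.0 -- html alone is a quarter of the 100 GB/month cap
```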

Cheers,

Matthew


[Numpy-discussion] Move scipy.org docs to Github?

2017-03-14 Thread Matthew Brett
Hi,

The scipy.org site is down at the moment, and has been for more than 36 hours:

https://github.com/numpy/numpy/issues/8779#issue-213781439

This has happened before:

https://github.com/scipy/scipy.org/issues/187#issue-186426408

I think it was down for about 24 hours that time.

From the number of people opening issues or commenting on the
scipy.org website this time, it seems to be causing quite a bit of
disruption.

It seems to me that we would have a much better chance of avoiding
significant down-time if we switched to hosting the docs on github
pages.

What do y'all think?

Cheers,

Matthew


[Numpy-discussion] Building with Clang on Windows?

2017-03-10 Thread Matthew Brett
Hi,

Has anyone succeeded in building numpy with Clang on Windows?  Or,
building any other Python extension with Clang on Windows?  I'd love
to hear about how you did it...

Cheers,

Matthew


Re: [Numpy-discussion] Numpy Overhead

2017-02-28 Thread Matthew Brett
Hi,

On Tue, Feb 28, 2017 at 3:04 PM, Sebastian K
 wrote:
> Yes you are right. There is no need to add that line. I deleted it. But the
> measured heap peak is still the same.

You're applying the naive matrix multiplication algorithm, which is
ideal for minimizing memory use during the computation, but terrible
for speed, because it makes poor use of the CPU cache:

https://en.wikipedia.org/wiki/Matrix_multiplication_algorithm

The Numpy version is likely calling into a highly optimized compiled
routine for matrix multiplication, which can load chunks of the
matrices at a time, to speed up computation.   If you really need
minimum memory heap usage and don't care about the order of
magnitude(s) slowdown, then you might need to use the naive method,
maybe implemented in Cython / C.
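For concreteness, here is a minimal pure-Python version of the naive algorithm (a sketch, not the original poster's code), checked against numpy:

```python
import numpy as np

def naive_matmul(a, b):
    """Triple-loop matrix multiply: only the output is allocated, but the
    inner loop strides through b column by column, which is cache-unfriendly
    and orders of magnitude slower than a tuned BLAS routine."""
    n, m, k = len(a), len(b), len(b[0])
    out = [[0.0] * k for _ in range(n)]
    for i in range(n):
        for j in range(k):
            s = 0.0
            for t in range(m):
                s += a[i][t] * b[t][j]
            out[i][j] = s
    return out

a = [[1.0, 2.0], [3.0, 4.0]]
b = [[5.0, 6.0], [7.0, 8.0]]
print(naive_matmul(a, b))  # [[19.0, 22.0], [43.0, 50.0]]
# Same result as numpy's compiled routine, just much slower at scale.
assert np.allclose(naive_matmul(a, b), np.array(a) @ np.array(b))
```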

Cheers,

Matthew


Re: [Numpy-discussion] Numpy Overhead

2017-02-28 Thread Matthew Brett
Hi,

On Tue, Feb 28, 2017 at 2:12 PM, Sebastian K
 wrote:
> Thank you for your answer.
> For example a very simple algorithm is a matrix multiplication. I can see
> that the heap peak is much higher for the numpy version in comparison to a
> pure python 3 implementation.
> The heap is measured with the libmemusage from libc:
>
>   heap peak
>   Maximum of all size arguments of malloc(3), all products
>   of nmemb*size of calloc(3), all size arguments of
>   realloc(3), length arguments of mmap(2), and new_size
>   arguments of mremap(2).

Could you post the exact code you're comparing?

I think you'll find that a naive Python 3 matrix multiplication method
is much, much slower than the same thing with Numpy, with arrays of
any reasonable size.

Cheers,

Matthew


[Numpy-discussion] Daily numpy wheel builds - prefer to per-commit builds?

2017-02-28 Thread Matthew Brett
Hi,

I've been working to get daily travis-ci cron-job manylinux builds
working for numpy and scipy wheels.   They are now working OK:

https://travis-ci.org/MacPython/numpy-wheels
https://travis-ci.org/MacPython/scipy-wheels
https://7933911d6844c6c53a7d-47bd50c35cd79bd838daf386af554a83.ssl.cf2.rackcdn.com

These are daily builds of the respective numpy and scipy master branches.

Numpy already has a system, kindly worked up by Olivier Grisel, which
uploads wheel builds from each travis-ci run. travis-ci uploads these
wheels for every commit, at the
"travis-dev-wheels" container on Rackspace, visible at
https://f66d8a5767b134cb96d3-4ffdece11fd3f72855e4665bc61c7445.ssl.cf2.rackcdn.com
. These builds are specific to the Linux they were built on - in this
case Ubuntu 12.04.

Some projects use these per-commit builds for testing compatibility
with development Numpy code.

Now I'm wondering what the relationship should be between the current
every-commit builds and the new wheel builds.

I think that we should prefer the new wheel builds and deprecate the
previous per-commit builds, because:

* the new manylinux wheels are self-contained, and so can be installed
with pip without extra lines of `apt` installs;
* the manylinux builds work on any travis container, not just the current
12.04 container;
* manylinux builds should be faster, as they are linked against OpenBLAS;
* manylinux wheels are closer to the wheels we distribute for
releases, and therefore more useful for testing against.

What do y'all think?

Cheers,

Matthew


Re: [Numpy-discussion] automatically avoiding temporary arrays

2017-02-27 Thread Matthew Brett
Hi,

On Mon, Feb 27, 2017 at 11:46 AM, Benjamin Root  wrote:
> Ah, that wheel would be a huge help. Most of the packages I have as
> dependencies for this project were compiled against v1.10, so I am hoping
> that there won't be too big of a problem.

I don't think so - the problems generally arise when you try to import
a package compiled against a more recent version of numpy than the one
you have installed.

> Are the manylinux wheels still
> compatible with CentOS5, or has the base image been bumped to CentOS6?

Yup, still CentOS5.  When it shifts to CentOS6, probably not very
soon, you'll likely see a version number bump to manylinux2
(manylinux1 is the current version).

Cheers,

Matthew


Re: [Numpy-discussion] automatically avoiding temporary arrays

2017-02-27 Thread Matthew Brett
Hi,

On Mon, Feb 27, 2017 at 11:27 AM, Charles R Harris
 wrote:
>
>
> On Mon, Feb 27, 2017 at 11:43 AM, Benjamin Root 
> wrote:
>>
>> What's the timeline for the next release? I have the perfect usecase for
>> this (Haversine calculation on large arrays that takes up ~33% of one of my
>> processing scripts). However, to test it out, I have a huge dependency mess
>> to wade through first, and there are no resources devoted to that project
>> for at least a few weeks. I want to make sure I get feedback to y'all.
>
>
> I'd like to branch 1.13.x at the end of March. The planned features that
> still need to go in are the `__array_ufunc__` work and the `lapack_lite`
> update. The first RC should not take much longer. I believe Matthew is
> building wheels for testing on the fly but I don't know where you can find
> them.

Latest wheels at :
https://7933911d6844c6c53a7d-47bd50c35cd79bd838daf386af554a83.ssl.cf2.rackcdn.com/

PRE_URL=https://7933911d6844c6c53a7d-47bd50c35cd79bd838daf386af554a83.ssl.cf2.rackcdn.com
pip install -f $PRE_URL --pre numpy

Cheers,

Matthew


Re: [Numpy-discussion] Removal of some numpy files

2017-02-25 Thread Matthew Brett
On Sat, Feb 25, 2017 at 7:48 AM, David Cournapeau  wrote:
> tools/win32build is used to build the so-called superpack installers, which
> we don't build anymore AFAIK
>
> tools/numpy-macosx-installer is used to build the .dmg for numpy (also not
> used anymore AFAIK).

No, we aren't using the .dmg script anymore, dmg installers have been
fully replaced by wheels.

Cheers,

Matthew


Re: [Numpy-discussion] PowerPC testing servers

2017-02-24 Thread Matthew Brett
On Wed, Feb 15, 2017 at 6:53 PM, Sandro Tosi  wrote:
>> A recent post to the wheel-builders mailing list pointed out some
>> links to places providing free PowerPC hosting for open source
>> projects, if they agree to a submitted request:
>
> The debian project has some powerpc machines (and we still build numpy
> on those boxes when i upload a new revision to our archives) and they
> also have hosts dedicated to let debian developers login and debug
> issues with their packages on that architecture. I can sponsor access
> to those machines for some of you, but it is not a place where you can
> host a CI instance.
>
> Just keep it in mind more broadly than powerpc, f.e. these are all the
> archs where numpy was built after the last upload:
> https://buildd.debian.org/status/package.php?p=python-numpy&suite=unstable
> (the grayed-out archs are non-release-critical, so packages are built
> on a best-effort basis and a missing build is not a big deal)

Numpy master now passes all tests on PPC64el.  Still a couple of
remaining failures for PPC64 (big-endian):

* https://github.com/numpy/numpy/pull/8566
* https://github.com/numpy/numpy/issues/8325

Cheers,

Matthew


Re: [Numpy-discussion] PowerPC testing servers

2017-02-21 Thread Matthew Brett
Hi,

On Thu, Feb 16, 2017 at 12:50 AM, Ralf Gommers <ralf.gomm...@gmail.com> wrote:
>
>
> On Thu, Feb 16, 2017 at 8:58 AM, Matthew Brett <matthew.br...@gmail.com>
> wrote:
>>
>> On Wed, Feb 15, 2017 at 7:55 PM, Ralf Gommers <ralf.gomm...@gmail.com>
>> wrote:
>> >
>> >
>> > On Thu, Feb 16, 2017 at 8:45 AM, Matthew Brett <matthew.br...@gmail.com>
>> > wrote:
>> >>
>> >> On Wed, Feb 15, 2017 at 7:37 PM, Ralf Gommers <ralf.gomm...@gmail.com>
>> >> wrote:
>> >> >
>> >> >
>> >> > On Thu, Feb 16, 2017 at 8:02 AM, Matthew Brett
>> >> > <matthew.br...@gmail.com>
>> >> > wrote:
>> >> >>
>> >> >> Hey,
>> >> >>
>> >> >> A recent post to the wheel-builders mailing list pointed out some
>> >> >> links to places providing free PowerPC hosting for open source
>> >> >> projects, if they agree to a submitted request:
>> >> >>
>> >> >>
>> >> >>
>> >> >> https://mail.python.org/pipermail/wheel-builders/2017-February/000257.html
>> >> >>
>> >> >> It would be good to get some testing going on these architectures.
>> >> >> Shall we apply for hosting, as the numpy organization?
>> >> >
>> >> >
>> >> > Those are bare VMs it seems. Remembering the Buildbot and Mailman
>> >> > horrors, I
>> >> > think we should be very reluctant to take responsibility for
>> >> > maintaining
>> >> > CI on anything that's not hosted and can be controlled with a simple
>> >> > config
>> >> > file in our repo.
>> >>
>> >> Not sure what you mean about mailman - maybe the Enthought servers we
>> >> didn't have access to?
>> >
>> >
>> > We did have access (for most of the time), it's just that no one is
>> > interested in putting in lots of hours on sysadmin duties.
>> >
>> >>
>> >> For buildbot, I've been maintaining about 12
>> >> crappy old machines for about 7 years now [1] - I'm happy to do the
>> >> same job for a couple of properly hosted PPC machines.
>> >
>> >
>> > That's awesome persistence. The NumPy and SciPy buildbots certainly
>> > weren't
>> > maintained like that, half of them were offline or broken for long
>> > periods
>> > usually.
>>
>> Right - they do need persistence, and to have someone who takes
>> responsibility for them.
>>
>> >>
>> >>  At least we'd
>> >> have some way of testing for these machines, if we get stuck - even if
>> >> that involved spinning up a VM and installing the stuff we needed from
>> >> the command line.
>> >
>> >
>> > I do see the value of testing on more platforms of course. It's just
>> > about
>> > logistics/responsibilities. If you're saying that you'll do the
>> > maintenance,
>> > and want to apply for resources using the NumPy name, that's much better
>> > I
>> > think than making "the numpy devs" collectively responsible.
>>
>> Yes, exactly.  I'm happy to take responsibility for them, I just
>> wanted to make sure that numpy devs could get at them if I'm not
>> around for some reason.
>
>
> In that case, +1 from me!

OK - IBM have kindly given me access to a testing machine, via my own
SSH public key.   Would it make sense to have a Numpy key, with
several people having access to the private key and passphrase?

Cheers,

Matthew


Re: [Numpy-discussion] PowerPC testing servers

2017-02-15 Thread Matthew Brett
On Wed, Feb 15, 2017 at 7:55 PM, Ralf Gommers <ralf.gomm...@gmail.com> wrote:
>
>
> On Thu, Feb 16, 2017 at 8:45 AM, Matthew Brett <matthew.br...@gmail.com>
> wrote:
>>
>> On Wed, Feb 15, 2017 at 7:37 PM, Ralf Gommers <ralf.gomm...@gmail.com>
>> wrote:
>> >
>> >
>> > On Thu, Feb 16, 2017 at 8:02 AM, Matthew Brett <matthew.br...@gmail.com>
>> > wrote:
>> >>
>> >> Hey,
>> >>
>> >> A recent post to the wheel-builders mailing list pointed out some
>> >> links to places providing free PowerPC hosting for open source
>> >> projects, if they agree to a submitted request:
>> >>
>> >>
>> >> https://mail.python.org/pipermail/wheel-builders/2017-February/000257.html
>> >>
>> >> It would be good to get some testing going on these architectures.
>> >> Shall we apply for hosting, as the numpy organization?
>> >
>> >
>> > Those are bare VMs it seems. Remembering the Buildbot and Mailman
>> > horrors, I
>> > think we should be very reluctant to take responsibility for
>> > maintaining
>> > CI on anything that's not hosted and can be controlled with a simple
>> > config
>> > file in our repo.
>>
>> Not sure what you mean about mailman - maybe the Enthought servers we
>> didn't have access to?
>
>
> We did have access (for most of the time), it's just that no one is
> interested in putting in lots of hours on sysadmin duties.
>
>>
>> For buildbot, I've been maintaining about 12
>> crappy old machines for about 7 years now [1] - I'm happy to do the
>> same job for a couple of properly hosted PPC machines.
>
>
> That's awesome persistence. The NumPy and SciPy buildbots certainly weren't
> maintained like that, half of them were offline or broken for long periods
> usually.

Right - they do need persistence, and to have someone who takes
responsibility for them.

>>
>>  At least we'd
>> have some way of testing for these machines, if we get stuck - even if
>> that involved spinning up a VM and installing the stuff we needed from
>> the command line.
>
>
> I do see the value of testing on more platforms of course. It's just about
> logistics/responsibilities. If you're saying that you'll do the maintenance,
> and want to apply for resources using the NumPy name, that's much better I
> think than making "the numpy devs" collectively responsible.

Yes, exactly.  I'm happy to take responsibility for them, I just
wanted to make sure that numpy devs could get at them if I'm not
around for some reason.

Matthew


Re: [Numpy-discussion] PowerPC testing servers

2017-02-15 Thread Matthew Brett
On Wed, Feb 15, 2017 at 7:37 PM, Ralf Gommers <ralf.gomm...@gmail.com> wrote:
>
>
> On Thu, Feb 16, 2017 at 8:02 AM, Matthew Brett <matthew.br...@gmail.com>
> wrote:
>>
>> Hey,
>>
>> A recent post to the wheel-builders mailing list pointed out some
>> links to places providing free PowerPC hosting for open source
>> projects, if they agree to a submitted request:
>>
>> https://mail.python.org/pipermail/wheel-builders/2017-February/000257.html
>>
>> It would be good to get some testing going on these architectures.
>> Shall we apply for hosting, as the numpy organization?
>
>
> Those are bare VMs it seems. Remembering the Buildbot and Mailman horrors, I
> think we should be very reluctant to take responsibility for maintaining
> CI on anything that's not hosted and can be controlled with a simple config
> file in our repo.

Not sure what you mean about mailman - maybe the Enthought servers we
didn't have access to?  For buildbot, I've been maintaining about 12
crappy old machines for about 7 years now [1] - I'm happy to do the
same job for a couple of properly hosted PPC machines.   At least we'd
have some way of testing for these machines, if we get stuck - even if
that involved spinning up a VM and installing the stuff we needed from
the command line.

Cheers,

Matthew

[1] http://nipy.bic.berkeley.edu/buildslaves


[Numpy-discussion] PowerPC testing servers

2017-02-15 Thread Matthew Brett
Hey,

A recent post to the wheel-builders mailing list pointed out some
links to places providing free PowerPC hosting for open source
projects, if they agree to a submitted request:

https://mail.python.org/pipermail/wheel-builders/2017-February/000257.html

It would be good to get some testing going on these architectures.
Shall we apply for hosting, as the numpy organization?

Cheers,

Matthew


Re: [Numpy-discussion] composing Euler rotation matrices

2017-02-01 Thread Matthew Brett
Hi,

On Wed, Feb 1, 2017 at 8:28 AM, Robert McLeod <robbmcl...@gmail.com> wrote:
> Instead of trying to decipher what someone wrote on a Wikipedia page, why don't
> you look at a working piece of source code?
>
> e.g.
>
> https://github.com/3dem/relion/blob/master/src/euler.cpp

Also - have a look at https://pypi.python.org/pypi/transforms3d - and
in particular you might get some use from symbolic versions of the
transformations, e.g. here :
https://github.com/matthew-brett/transforms3d/blob/master/transforms3d/derivations/eulerangles.py

It's really easy to mix up the conventions, as I'm sure you know - see
http://matthew-brett.github.io/transforms3d/reference/transforms3d.euler.html
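As a toy illustration of why the conventions matter (plain numpy with arbitrary example angles, not code from transforms3d): composing the same elemental rotations in a different order gives a different matrix.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

ax, ay, az = 0.1, 0.2, 0.3  # arbitrary example angles

zyx = rot_z(az) @ rot_y(ay) @ rot_x(ax)  # one common composition order
xyz = rot_x(ax) @ rot_y(ay) @ rot_z(az)  # the reverse order

# Both are valid rotation matrices (orthogonal, determinant +1)...
assert np.allclose(zyx @ zyx.T, np.eye(3))
# ...but they are different rotations: axis order and convention matter.
assert not np.allclose(zyx, xyz)
```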

Cheers,

Matthew


[Numpy-discussion] Numpy development version wheels for testing

2017-01-27 Thread Matthew Brett
Hi,

I've taken advantage of the new travis-ci cron job feature [1] to set
up daily builds of numpy manylinux and OSX wheels for the current
trunk, uploading to:

https://7933911d6844c6c53a7d-47bd50c35cd79bd838daf386af554a83.ssl.cf2.rackcdn.com

The numpy build process already builds Ubuntu Precise numpy wheels for
the current trunk, available at [2], but the cron-job manylinux wheels
have the following advantages:

* they are built the same way as our usual pypi wheels, using
openblas, and so will be closer to the eventual numpy distributed
wheel;
* manylinux wheels will install on all the travis-ci containers, not
just the Precise container;
* manylinux wheels don't need any extra packages installed by apt,
because they are self-contained.

There's an example of use at
https://github.com/matthew-brett/nibabel/blob/use-pre/.travis.yml#L23

Cheers,

Matthew

[1] https://docs.travis-ci.com/user/cron-jobs
[2] 
https://f66d8a5767b134cb96d3-4ffdece11fd3f72855e4665bc61c7445.ssl.cf2.rackcdn.com


Re: [Numpy-discussion] [SciPy-Dev] NumPy 1.12.0 release

2017-01-17 Thread Matthew Brett
On Tue, Jan 17, 2017 at 3:47 PM, Neal Becker <ndbeck...@gmail.com> wrote:
> Matthew Brett wrote:
>
>> Hi,
>>
>> On Tue, Jan 17, 2017 at 5:56 AM, Neal Becker <ndbeck...@gmail.com> wrote:
>>> Charles R Harris wrote:
>>>
>>>> Hi All,
>>>>
>>>> I'm pleased to announce the NumPy 1.12.0 release. This release supports
>>>> Python 2.7 and 3.4-3.6. Wheels for all supported Python versions may be
>>>> downloaded from PyPI
>>>> <https://pypi.python.org/pypi?%3Aaction=pkg_edit=numpy>, the
>>>> tarball and zip files may be downloaded from Github
>>>> <https://github.com/numpy/numpy/releases/tag/v1.12.0>. The release notes
>>>> and file hashes may also be found at Github
>>>> <https://github.com/numpy/numpy/releases/tag/v1.12.0>.
>>>>
>>>> NumPy 1.12.0rc2 is the result of 418 pull requests submitted by 139
>>>> contributors and comprises a large number of fixes and improvements. Among
>>>> the many improvements it is difficult to pick out just a few as
>>>> standing above the others, but the following may be of particular
>>>> interest or indicate areas likely to have future consequences.
>>>>
>>>> * Order of operations in ``np.einsum`` can now be optimized for large
>>>> speed improvements.
>>>> * New ``signature`` argument to ``np.vectorize`` for vectorizing with
>>>> core dimensions.
>>>> * The ``keepdims`` argument was added to many functions.
>>>> * New context manager for testing warnings
>>>> * Support for BLIS in numpy.distutils
>>>> * Much improved support for PyPy (not yet finished)
>>>>
>>>> Enjoy,
>>>>
>>>> Chuck
>>>
>>> I've installed via pip3 on linux x86_64, which gives me a wheel.  My
>>> question is, am I losing significant performance choosing this pre-built
>>> binary vs. compiling myself?  For example, my processor might have some
>>> more features than the base version used to build wheels.
>>
>> I guess you are thinking about using this built wheel on some other
>> machine?   You'd have to be lucky for that to work; the wheel depends
>> on the symbols it found at build time, which may not exist in the same
>> places on your other machine.
>>
>> If it does work, the speed will primarily depend on your BLAS library.
>>
>> The pypi wheels should be pretty fast; they are built with OpenBLAS,
>> which is at or near top of range for speed, across a range of
>> platforms.
>>
>> Cheers,
>>
>> Matthew
>
> I installed using pip3 install, and it installed a wheel package.  I did not
> build it - aren't wheels already compiled packages?  So isn't it built for
> the common denominator architecture, not necessarily as fast as one I built
> myself on my own machine?  My question is, on x86_64, is this potential
> difference large enough to bother with not using precompiled wheel packages?

Ah - my guess is that you'd be hard pressed to make a numpy that is as
fast as the precompiled wheel.   The OpenBLAS library included in
numpy selects the routines for your CPU at run-time, so they will
generally be fast on your CPU.   You might be able to get equivalent
or even better performance with an ATLAS BLAS library recompiled on
your exact machine, but that's quite a serious investment of time to
get working, and you'd have to benchmark to find if you were really
doing any better.
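If you want to see which BLAS/LAPACK a given numpy install was linked against, numpy can report its build-time configuration (the output format and libraries vary by platform and build):

```python
import numpy as np

# Print the BLAS/LAPACK libraries this numpy build was linked against.
# A pypi wheel typically reports openblas here; a source build reports
# whatever was found at compile time (ATLAS, MKL, reference BLAS, ...).
np.show_config()
```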

Cheers,

Matthew


Re: [Numpy-discussion] NumPy 1.12.0 release

2017-01-17 Thread Matthew Brett
Hi,

On Tue, Jan 17, 2017 at 5:56 AM, Neal Becker  wrote:
> Charles R Harris wrote:
>
>> Hi All,
>>
>> I'm pleased to announce the NumPy 1.12.0 release. This release supports
>> Python 2.7 and 3.4-3.6. Wheels for all supported Python versions may be
>> downloaded from PyPI, the tarball and zip files may be downloaded from
>> Github. The release notes and file hashes may also be found at Github.
>>
>> NumPy 1.12.0rc2 is the result of 418 pull requests submitted by 139
>> contributors and comprises a large number of fixes and improvements. Among
>> the many improvements it is difficult to pick out just a few as standing
>> above the others, but the following may be of particular interest or
>> indicate areas likely to have future consequences.
>>
>> * Order of operations in ``np.einsum`` can now be optimized for large
>> speed improvements.
>> * New ``signature`` argument to ``np.vectorize`` for vectorizing with core
>> dimensions.
>> * The ``keepdims`` argument was added to many functions.
>> * New context manager for testing warnings
>> * Support for BLIS in numpy.distutils
>> * Much improved support for PyPy (not yet finished)
>>
>> Enjoy,
>>
>> Chuck
>
> I've installed via pip3 on linux x86_64, which gives me a wheel.  My
> question is, am I losing significant performance choosing this pre-built
> binary vs. compiling myself?  For example, my processor might have some more
> features than the base version used to build wheels.

I guess you are thinking about using this built wheel on some other
machine?   You'd have to be lucky for that to work; the wheel depends
on the symbols it found at build time, which may not exist in the same
places on your other machine.

If it does work, the speed will primarily depend on your BLAS library.

The pypi wheels should be pretty fast; they are built with OpenBLAS,
which is at or near top of range for speed, across a range of
platforms.

Cheers,

Matthew


Re: [Numpy-discussion] question about long doubles on ppc64el

2017-01-16 Thread Matthew Brett
Hi,

On Sun, Jan 15, 2017 at 10:00 PM, Thomas Caswell  wrote:
> Folks,
>
> Over at h5py we are trying to get a release out and have discovered (via
> debian) that on ppc64el there is an apparent disagreement between the size
> of a native long double according to hdf5 and numpy.
>
> For all of the gorey details see: https://github.com/h5py/h5py/issues/817 .
>
> In short, `np.longdouble` seems to be `np.float128` and according to the
> docs should map to the native 'long double'.  However, hdf5 provides a
> `H5T_NATIVE_LDOUBLE` which should also refer to the native 'long double',
> but seems to be a 64 bit float.
>
> Anyone on this list have a ppc64el machine (or experience with) that can
> provide some guidance here?

I know that long double on numpy for the PPC on Mac G4 (power64 arch)
is the double-double ("twin double") format, as expected, so I'd be
surprised if that wasn't true for numpy on ppc64el.
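A quick way to check what numpy thinks the native long double is on any given box (the values in the comments are platform-dependent examples, not guarantees):

```python
import numpy as np

info = np.finfo(np.longdouble)
print(np.dtype(np.longdouble))           # e.g. float128 on most 64-bit Linux
print(np.dtype(np.longdouble).itemsize)  # storage size in bytes
print(info.nmant)                        # mantissa bits: 52 => plain double,
                                         # 63 => x87 80-bit extended,
                                         # 105 => IBM double-double
```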

Do you want a login for the G4 running Jessie?   If so, send me your
public key off-list?

Cheers,

Matthew


Re: [Numpy-discussion] Wheel files for PPC64

2016-11-23 Thread Matthew Brett
Hi,

On Wed, Nov 23, 2016 at 6:47 AM, Leonardo Bianconi
 wrote:
> Hi all!
>
>
>
> I realized that there are no “whl” files for PPC64
> (https://pypi.python.org/pypi/numpy/1.12.0b1), which means numpy gets
> compiled from source when installed with pip. This causes poor performance
> on this architecture when libopenblas is not already installed, while for
> x86_64 the library is bundled inside the “whl” file.
>
>
>
> What must be done to add a build of numpy for PPC64, and upload the “whl”
> files to the pypi link?

The problem for wheel files is agreeing to some standard for binary
compatibility that guarantees the wheel will work on more than a
single Unix distribution / version.

The work to do that for Intel Linux is over at
https://www.python.org/dev/peps/pep-0513 .  As you can see, that PEP
defines a standard set of libraries and library symbol versions to
build against, such that the resulting wheel will be compatible with a
large number of Intel Linux distributions. There's also a standard
docker container to build with, containing the correct libraries /
symbols.

Otherwise, you'll have to build the wheels for your platform only, and
put them up somewhere in a web directory, for others with the same
platform.  It doesn't make sense to put them on pypi, because, unless
you have done the standardization work, they will likely fail for most
people.

Best,

Matthew


Re: [Numpy-discussion] NumPy 1.12.0b1 released

2016-11-18 Thread Matthew Brett
Hi,

On Thu, Nov 17, 2016 at 3:24 PM, Matti Picus  wrote:
> Congrats to all on the release. Two questions:
>
> Is there a guide to building standard wheels for NumPy?

I don't think so - there is a repository that we use to build the
wheels, which has the Windows, OSX and manylinux recipes for the
standard CPython build:

https://github.com/MacPython/numpy-wheels

If you can work out a way to automate the PyPy builds and tests -
especially using the same repo - that would be very useful.

> Assuming I can build standardized PyPy 2.7 wheels for Ubuntu, Win32 and
> OSX64, how can I get them blessed and uploaded to PyPI?

If you can automate the build and tests, I'm guessing there will be no
objections - but it's not my call...

Cheers,

Matthew


Re: [Numpy-discussion] NumPy 1.11.2 released

2016-10-03 Thread Matthew Brett
On Mon, Oct 3, 2016 at 7:15 PM, Charles R Harris
 wrote:
> Hi All,
>
> I'm pleased to announce the release of Numpy 1.11.2. This release supports
> Python 2.6 - 2.7, and 3.2 - 3.5 and fixes bugs and regressions found in
> Numpy 1.11.1.  Wheels for Linux, Windows, and OSX can be found on PyPI.
> Sources are available on both PyPI and Sourceforge.
>
> Thanks to all who were involved in this release. Contributors and merged
> pull requests are listed below.
>
>
> Contributors to v1.11.2
>
> Allan Haldane
> Bertrand Lefebvre
> Charles Harris
> Julian Taylor
> Loïc Estève
> Marshall Bockrath-Vandegrift +
> Michael Seifert +
> Pauli Virtanen
> Ralf Gommers
> Sebastian Berg
> Shota Kawabuchi +
> Thomas A Caswell
> Valentin Valls +
> Xavier Abellan Ecija +
>
> A total of 14 people contributed to this release. People with a "+" by their
> names contributed a patch for the first time.
>
> Pull requests merged for v1.11.2
>
> #7736: Backport 4619, BUG: many functions silently drop keepdims kwarg
> #7738: Backport 5706, ENH: add extra kwargs and update doc of many MA...
> #7778: DOC: Update Numpy 1.11.1 release notes.
> #7793: Backport 7515, BUG: MaskedArray.count treats negative axes
> incorrectly
> #7816: Backport 7463, BUG: fix array too big error for wide dtypes.
> #7821: Backport 7817, BUG: Make sure npy_mul_with_overflow_ detects...
> #7824: Backport 7820, MAINT: Allocate fewer bytes for empty arrays.
> #7847: Backport 7791, MAINT,DOC: Fix some imp module uses and update...
> #7849: Backport 7848, MAINT: Fix remaining uses of deprecated Python...
> #7851: Backport 7840, Fix ATLAS version detection
> #7870: Backport 7853, BUG: Raise RuntimeError when reloading numpy is...
> #7896: Backport 7894, BUG: construct ma.array from np.array which
> contains...
> #7904: Backport 7903, BUG: fix float16 type not being called due to...
> #7917: BUG: Production install of numpy should not require nose.
> #7919: Backport 7908, BLD: Fixed MKL detection for recent versions of...
> #7920: Backport #7911: BUG: fix for issue#7835 (ma.median of 1d)
> #7932: Backport 7925, Monkey-patch _msvccompile.gen_lib_option like...
> #7939: Backport 7931, BUG: Check for HAVE_LDOUBLE_DOUBLE_DOUBLE_LE in...
> #7953: Backport 7937, BUG: Guard against buggy comparisons in generic...
> #7954: Backport 7952, BUG: Use keyword arguments to initialize Extension...
> #7955: Backport 7941, BUG: Make sure numpy globals keep identity after...
> #7972: Backport 7963, BUG: MSVCCompiler grows 'lib' & 'include' env...
> #7990: Backport 7977, DOC: Create 1.11.2 release notes.
> #8005: Backport 7956, BLD: remove __NUMPY_SETUP__ from builtins at end...
> #8007: Backport 8006, DOC: Update 1.11.2 release notes.
> #8010: Backport 8008, MAINT: Remove leftover imp module imports.
> #8012: Backport 8011, DOC: Update 1.11.2 release notes.
> #8020: Backport 8018, BUG: Fixes return for np.ma.count if keepdims...
> #8024: Backport 8016, BUG: Fix numpy.ma.median.
> #8031: Backport 8030, BUG: fix np.ma.median with only one non-masked...
> #8032: Backport 8028, DOC: Update 1.11.2 release notes.
> #8044: Backport 8042, BUG: core: fix bug in NpyIter buffering with
> discontinuous...
> #8046: Backport 8045, DOC: Update 1.11.2 release notes.

Thanks very much for doing all the release work, congratulations on the release,

Cheers,

Matthew
___


Re: [Numpy-discussion] Accelerate or OpenBLAS for numpy / scipy wheels?

2016-06-28 Thread Matthew Brett
Hi,

On Tue, Jun 28, 2016 at 7:33 AM, Ralf Gommers <ralf.gomm...@gmail.com> wrote:
>
>
> On Tue, Jun 28, 2016 at 2:55 PM, Matthew Brett <matthew.br...@gmail.com>
> wrote:
>>
>> Hi,
>>
>> On Tue, Jun 28, 2016 at 5:25 AM, Charles R Harris
>> <charlesr.har...@gmail.com> wrote:
>> >
>> >
>> > On Mon, Jun 27, 2016 at 9:46 PM, Matthew Brett <matthew.br...@gmail.com>
>> > wrote:
>> >>
>> >> Hi,
>> >>
>> >> I just succeeded in getting an automated dual arch build of numpy and
>> >> scipy, using OpenBLAS.  See the last three build jobs in these two
>> >> build matrices:
>> >>
>> >> https://travis-ci.org/matthew-brett/numpy-wheels/builds/140388119
>> >> https://travis-ci.org/matthew-brett/scipy-wheels/builds/140684673
>> >>
>> >> Tests are passing on 32 and 64-bit.
>> >>
>> >> I didn't upload these to the usual Rackspace container at
>> >> wheels.scipy.org to avoid confusion.
>> >>
>> >> So, I guess the question now is - should we switch to shipping
>> >> OpenBLAS wheels for the next release of numpy and scipy?  Or should we
>> >> stick with the Accelerate framework that comes with OSX?
>> >>
>> >> In favor of the Accelerate build : faster to build, it's what we've
>> >> been doing thus far.
>
>
> Faster to build isn't really an argument right? Should be the same build
> time except for building OpenBLAS itself once per OpenBLAS version. And only
> applies to building wheels for releases - nothing changes for source builds
> done by users on OS X. If build time ever becomes a real issue, then
> dropping the dual arch stuff is probably the way to go - the 32-bit builds
> make very little sense these days.

Yes, that's true, but as you know, the OSX system and Python.org
Pythons are still dual arch, so technically a matching wheel should
also be dual arch.   I agree that we're near the point where there's
near zero likelihood that the 32-bit arch will ever get exercised.

> What we've been doing thus far - that is the more important argument.
> There's a risk in switching, we may encounter new bugs or lose some
> performance in particular functions.
>
>>
>> >>
>> >> In favor of OpenBLAS build : allows us to commit to one BLAS / LAPACK
>> >> library cross platform,
>
>
> This doesn't really matter too much imho, we have to support Accelerate
> either way.
>
>>
>> when we have the Windows builds working.
>> >> Faster to fix bugs with good support from main developer.  No
>> >> multiprocessing crashes for Python 2.7.
>
>
> This is probably the main reason to make the switch, if we decide to do
> that.
>
>>
>>   I'm still a bit nervous about OpenBLAS, see
>> > https://github.com/scipy/scipy/issues/6286. That was with version
>> > 0.2.18,
>> > which is pretty recent.
>>
>> Well - we are committed to OpenBLAS already for the Linux wheels, so
>> if that failure was due to an error in OpenBLAS, we'll have to report
>> it and get it fixed / fix it ourselves upstream.
>
>
> Indeed. And those wheels have been downloaded a lot already, without any
> issues being reported.
>
> I'm +0 on the proposal - the risk seems acceptable, but the reasons to make
> the switch are also not super compelling.

I guess I'm about +0.5 (multiprocessing, simplifying mainstream blas /
lapack support) - I'm floating it now because I hadn't got the build
machinery working before.

Cheers,

Matthew
___


Re: [Numpy-discussion] Accelerate or OpenBLAS for numpy / scipy wheels?

2016-06-28 Thread Matthew Brett
Hi,

On Tue, Jun 28, 2016 at 5:25 AM, Charles R Harris
<charlesr.har...@gmail.com> wrote:
>
>
> On Mon, Jun 27, 2016 at 9:46 PM, Matthew Brett <matthew.br...@gmail.com>
> wrote:
>>
>> Hi,
>>
>> I just succeeded in getting an automated dual arch build of numpy and
>> scipy, using OpenBLAS.  See the last three build jobs in these two
>> build matrices:
>>
>> https://travis-ci.org/matthew-brett/numpy-wheels/builds/140388119
>> https://travis-ci.org/matthew-brett/scipy-wheels/builds/140684673
>>
>> Tests are passing on 32 and 64-bit.
>>
>> I didn't upload these to the usual Rackspace container at
>> wheels.scipy.org to avoid confusion.
>>
>> So, I guess the question now is - should we switch to shipping
>> OpenBLAS wheels for the next release of numpy and scipy?  Or should we
>> stick with the Accelerate framework that comes with OSX?
>>
>> In favor of the Accelerate build : faster to build, it's what we've
>> been doing thus far.
>>
>> In favor of OpenBLAS build : allows us to commit to one BLAS / LAPACK
>> library cross platform, when we have the Windows builds working.
>> Faster to fix bugs with good support from main developer.  No
>> multiprocessing crashes for Python 2.7.
>
>
> I'm still a bit nervous about OpenBLAS, see
> https://github.com/scipy/scipy/issues/6286. That was with version 0.2.18,
> which is pretty recent.

Well - we are committed to OpenBLAS already for the Linux wheels, so
if that failure was due to an error in OpenBLAS, we'll have to report
it and get it fixed / fix it ourselves upstream.

Cheers,

Matthew
___


[Numpy-discussion] Accelerate or OpenBLAS for numpy / scipy wheels?

2016-06-27 Thread Matthew Brett
Hi,

I just succeeded in getting an automated dual arch build of numpy and
scipy, using OpenBLAS.  See the last three build jobs in these two
build matrices:

https://travis-ci.org/matthew-brett/numpy-wheels/builds/140388119
https://travis-ci.org/matthew-brett/scipy-wheels/builds/140684673

Tests are passing on 32 and 64-bit.

I didn't upload these to the usual Rackspace container at
wheels.scipy.org to avoid confusion.

So, I guess the question now is - should we switch to shipping
OpenBLAS wheels for the next release of numpy and scipy?  Or should we
stick with the Accelerate framework that comes with OSX?

In favor of the Accelerate build : faster to build, it's what we've
been doing thus far.

In favor of OpenBLAS build : allows us to commit to one BLAS / LAPACK
library cross platform, when we have the Windows builds working.
Faster to fix bugs with good support from main developer.  No
multiprocessing crashes for Python 2.7.

Any thoughts?

Cheers,

Matthew
___


[Numpy-discussion] Pip download stats for numpy

2016-06-24 Thread Matthew Brett
Hi,

I just ran a query on pypi downloads [1] using the BigQuery interface
to pypi stats [2].  It lists the numpy files downloaded from pypi via
a pip install, over the last two weeks, ordered by the number of
downloads:

1  100595 numpy-1.11.0.tar.gz
2  97754 numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl
3  38471 numpy-1.8.1-cp27-cp27mu-manylinux1_x86_64.whl
4  20874 numpy-1.11.0-cp27-none-win_amd64.whl
5  20049 
numpy-1.11.0-cp27-cp27m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
6  17100 numpy-1.10.4-cp27-cp27mu-manylinux1_x86_64.whl
7  15187 numpy-1.10.1.zip
8  14277 numpy-1.11.0-cp35-cp35m-manylinux1_x86_64.whl
9  11538 numpy-1.9.1.tar.gz
10 11272 numpy-1.11.0-cp27-none-win32.whl

Of course, it's difficult to know how many of these are from automated
builds, such as from travis-ci, but it does look as if manylinux
wheels are getting some traction.

Cheers,

Matthew

[1]
SELECT
 COUNT(*) AS downloads,
 file.filename
FROM
  TABLE_DATE_RANGE(
[the-psf:pypi.downloads],
TIMESTAMP("20160610"),
CURRENT_TIMESTAMP()
  )
WHERE
 details.installer.name = 'pip'
 AND REGEXP_MATCH(file.filename, '^numpy-.*')
GROUP BY
 file.filename
ORDER BY
 downloads DESC
LIMIT
 1000

[2] https://mail.python.org/pipermail/distutils-sig/2016-May/028986.html
___


Re: [Numpy-discussion] scipy 0.18 release candidate 1

2016-06-20 Thread Matthew Brett
Hi,

On Mon, Jun 20, 2016 at 6:31 AM, Evgeni Burovski
 wrote:
> Hi,
>
> I'm pleased to announce the availability of the first release
> candidate for scipy 0.18.0.
> Please try this release and report any issues on Github tracker,
> https://github.com/scipy/scipy, or scipy-dev mailing list.
> Source tarballs and release notes are available from Github releases,
> https://github.com/scipy/scipy/releases/tag/v0.18.0rc1
>
> Please note that this is a source-only release. We do not provide
> Windows binaries for this release. OS X and Linux wheels will be
> provided for the final release.
>
> The current release schedule is
>
> 27 June: rc2 (if necessary)
> 11 July:  final release
>
> Thanks to everyone who contributed to this release!
>
> Cheers,
>
> Evgeni
>
>
> A part of the release notes follows:
>
>
>
> ==========================
> SciPy 0.18.0 Release Notes
> ==========================
>
> .. note:: Scipy 0.18.0 is not released yet!
>
> .. contents::
>
> SciPy 0.18.0 is the culmination of 6 months of hard work. It contains
> many new features, numerous bug-fixes, improved test coverage and
> better documentation.  There have been a number of deprecations and
> API changes in this release, which are documented below.  All users
> are encouraged to upgrade to this release, as there are a large number
> of bug-fixes and optimizations.  Moreover, our development attention
> will now shift to bug-fix releases on the 0.19.x branch, and on adding
> new features on the master branch.
>
> This release requires Python 2.7 or 3.4-3.5 and NumPy 1.7.1 or greater.

Thanks a lot for taking on the release.

I put the manylinux1 and OSX wheel building into a single repo to test
64- and 32-bit linux wheels.  There's a test run with the 0.18.0rc1
code here:

https://travis-ci.org/MacPython/scipy-wheels/builds/139084454

For Python 3 I am getting these errors:
https://github.com/scipy/scipy/issues/6292

For all 32-bit builds I am getting this error:
https://github.com/scipy/scipy/issues/6093

For the Python 3 32-bit builds I am also getting this error:
https://github.com/scipy/scipy/issues/6101

For the builds that succeeded without failure (all OSX and manylinux1
for 64 bit Python 2.7), you can test with:

python -m pip install -U pip
pip install --trusted-host wheels.scipy.org -f
https://wheels.scipy.org -U --pre scipy

Thanks again, sorry for the tiring news,

Matthew
___


Re: [Numpy-discussion] Datarray 0.1.0 release

2016-06-10 Thread Matthew Brett
On Fri, Jun 10, 2016 at 1:19 PM, Stephan Hoyer <sho...@gmail.com> wrote:
> On Fri, Jun 10, 2016 at 12:51 PM, Matthew Brett <matthew.br...@gmail.com>
> wrote:
>>
>> If you like the general idea, and you don't mind the pandas
>> dependency, `xray` is a much better choice for production code right
>> now, and will do the same stuff and more:
>>
>> https://pypi.python.org/pypi/xray/0.4.1
>>
>
>
> Hi Matthew,
>
> Congrats on the release!
>
> I just wanted to point out that "xray" is now known as "xarray":
> https://pypi.python.org/pypi/xarray/

Ah - thank you - I'll update the docs...

Cheers,

Matthew
___


[Numpy-discussion] Datarray 0.1.0 release

2016-06-10 Thread Matthew Brett
Hi,

I just released a new version of the Datarray package:

https://pypi.python.org/pypi/datarray/0.1.0
https://github.com/BIDS/datarray

It's a very lightweight implementation of arrays with labeled axes and
ticks, that allows you to do stuff like:

>>> narr = DataArray(np.zeros((1,2,3)), axes=('a','b','c'))
>>> narr.axes.a
Axis(name='a', index=0, labels=None)
>>> narr.axes.a[0]
DataArray(array([[ 0.,  0.,  0.],
   [ 0.,  0.,  0.]]),
('b', 'c'))

It's still experimental, but we'd love to hear any feedback, in any
form.  Please feel free to make github issues for specific bugs /
suggestions:

https://github.com/BIDS/datarray/issues

If you like the general idea, and you don't mind the pandas
dependency, `xray` is a much better choice for production code right
now, and will do the same stuff and more:

https://pypi.python.org/pypi/xray/0.4.1

Cheers,

Matthew
___


Re: [Numpy-discussion] Integers to integer powers, let's make a decision

2016-06-04 Thread Matthew Brett
On Sat, Jun 4, 2016 at 12:47 PM,   wrote:
>
>
> On Sat, Jun 4, 2016 at 3:43 PM, Charles R Harris 
> wrote:
>>
>>
>>
>> On Sat, Jun 4, 2016 at 11:22 AM, Charles R Harris
>>  wrote:
>>>
>>> Hi All,
>>>
>>> I've made a new post so that we can make an explicit decision. AFAICT,
>>> the two proposals are
>>>
>>> Integers to negative integer powers raise an error.
>>> Integers to integer powers always results in floats.
>>>
>>> My own sense is that 1. would be closest to current behavior and using a
>>> float exponential when a float is wanted is an explicit way to indicate that
>>> desire. OTOH, 2. would be the most convenient default for everyday numerical
>>> computation, but I think would more likely break current code. I am going to
>>> come down on the side of 1., which I don't think should cause too many
>>> problems if we start with a {Future, Deprecation}Warning explaining the
>>> workaround.
>
>
> I'm in favor of 2.  always float for `**`
> I don't see enough pure integer usecases to throw away a nice operator.

I can't make sense of 'throw away a nice operator' - you still have
arr ** 2.0 if you want floats.
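To make the point concrete, a quick sketch of the current behavior being discussed (the integer result keeps the platform's default integer dtype):

```python
import numpy as np

arr = np.arange(5)
# An integer exponent keeps the integer dtype...
print((arr ** 2).dtype == arr.dtype)   # True
# ...while an explicit float exponent is the spelling for "I want floats".
print((arr ** 2.0).dtype)              # float64
```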

Matthew
___


Re: [Numpy-discussion] Integers to integer powers, let's make a decision

2016-06-04 Thread Matthew Brett
On Sat, Jun 4, 2016 at 10:45 AM, Nathaniel Smith  wrote:
> +1
>
> On Sat, Jun 4, 2016 at 10:22 AM, Charles R Harris
>  wrote:
>> Hi All,
>>
>> I've made a new post so that we can make an explicit decision. AFAICT, the
>> two proposals are
>>
>> Integers to negative integer powers raise an error.
>> Integers to integer powers always results in floats.
>>
>> My own sense is that 1. would be closest to current behavior and using a
>> float exponential when a float is wanted is an explicit way to indicate that
>> desire. OTOH, 2. would be the most convenient default for everyday numerical
>> computation, but I think would more likely break current code. I am going to
>> come down on the side of 1., which I don't think should cause too many
>> problems if we start with a {Future, Deprecation}Warning explaining the
>> workaround.

I agree - error for negative integer powers seems like the safest option.

Cheers,

Matthew
___


Re: [Numpy-discussion] Integers to integer powers

2016-05-24 Thread Matthew Brett
On Tue, May 24, 2016 at 1:31 PM, Alan Isaac  wrote:
> On 5/24/2016 1:19 PM, Stephan Hoyer wrote:
>>
>> the int ** 2 example feels quite compelling to me
>
>
>
> Yes, but that one case is trivial: a*a

Right, but you'd have to know to change your code when numpy makes
this change.   Your code will suddenly have a new datatype, that you
may be passing into other code, which may generate different results,
that will be hard to track down.

> And at least as compelling is not have a**-2 fail

I imagine that's about an order of magnitude less common in real code,
but an error seems an acceptable result to me, so I know I have to do
a ** -2.0

> and not being tricked by say np.arange(10)**10.
> The latter is a promise of hidden errors.

It's a well-understood promise though - you always have to be careful
of integer overflow.
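Both halves of that trade-off can be demonstrated directly (checked against a recent NumPy; the error for negative integer exponents is the option-1 behavior that eventually landed in NumPy 1.12):

```python
import numpy as np

# Silent wraparound: 9 ** 10 does not fit in an int32, and the integer
# power wraps rather than raising.
a = np.arange(10, dtype=np.int32)
print(int((a ** 10)[9]) == 9 ** 10)   # False

# Negative integer exponents on integer arrays raise instead, so that
# failure is at least loud; a ** -2.0 is the explicit float spelling.
try:
    np.arange(1, 5) ** -2
except ValueError as exc:
    print("ValueError:", exc)
```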

Best,

Matthew
___


Re: [Numpy-discussion] Adding the Linux Wheels for old releases breaks builds

2016-04-25 Thread Matthew Brett
Hi,

On Mon, Apr 25, 2016 at 2:25 PM, Christian Aichinger
 wrote:
> Hi!
> The addition of the Linux Wheels broke the build process of several of our 
> Debian packages, which rely on NumPy installed inside virtualenvs. The 
> problem stems from the pre-compiled shared libraries included in the Wheels, 
> details are in .
>
> I'm bringing this up here because these changes have implications that may 
> not have been fully realized before.
>
> The Wheel packages are great for end users, they make NumPy much more easily 
> installable for average people. Unfortunately, they are precisely the wrong 
> thing for anyone re-packaging NumPy (e.g. shipping it in a virtualenv inside 
> RPM or Debian packages). For that use-case, you typically want to build NumPy 
> yourself.[1] You could rely on this happening before, now a `--no-binary` 
> argument for pip is needed to get that behavior. Put another way, the 
> addition of the Wheels silently invalidated the assumption that a `pip 
> install numpy` would locally compile the package.
>
> In the perfect world, anyone re-packaging NumPy would specify `--no-binary` 
> if they want to enforce local building. However, currently, --no-binary is 
> not in widespread use because it was never necessary before.
>
> I fully agree that the Wheels have great value, but adding them for old 
> releases (back to 1.6.0 from 2011) suddenly changes the NumPy distribution 
> for people who explicitly pinned an older version to avoid surprises. It 
> invites downstream build failures (as happened to us) and adds 
> externally-built shared objects in a way that people won't expect.
>
> I would propose to only add Wheels for new releases and to explicitly mention 
> this issue in the release notes, so people are not blind-sided by it. I 
> realize that this would be a painfully slow process, but silently breaking 
> previously working use-cases for old releases seems worse to me (though it is 
> difficult to estimate how many people are negatively affected by this).

There's more discussion of this issue over on
https://github.com/numpy/numpy/issues/7570

Cheers,

Matthew
___


Re: [Numpy-discussion] linux wheels coming soon

2016-04-22 Thread Matthew Brett
On Fri, Apr 22, 2016 at 11:27 AM, Olivier Grisel
<olivier.gri...@ensta.org> wrote:
> 2016-04-22 20:17 GMT+02:00 Matthew Brett <matthew.br...@gmail.com>:
>>
>> The github releases idea sounds intriguing.   Do you have any
>> experience with that?  Are there good examples other than the API
>> documentation?
>>
>> https://developer.github.com/v3/repos/releases/
>
> I never used it by I assume we could create a numpy-openblas repo to
> host official builds suitable for embedding numpy wheels for stable
> each releases of OpenBLAS:
>
> There is also a travis deployment target.
>
> https://docs.travis-ci.com/user/deployment/releases

Ah - thanks - that's good resource.

> I am not sure that the travis timeout is long enough to build
> openblas. I believe so but I have not tried myself yet.

Yes, the manylinux-builds repo currently builds openblas for each
entry in the build matrix, so it's easily within time:

https://travis-ci.org/matthew-brett/manylinux-builds/builds/123643313

It would be good to think of a way of supporting a set of libraries,
such as libpng, freetype, openblas.  We might need to support both
64-bit and 32-bit versions as well.  Then, some automated build script
would by default pick up the latest of these for numpy, matplotlib
etc.

Matthew
___


Re: [Numpy-discussion] linux wheels coming soon

2016-04-22 Thread Matthew Brett
On Thu, Apr 21, 2016 at 1:47 AM, Olivier Grisel
<olivier.gri...@ensta.org> wrote:
> 2016-04-20 16:57 GMT+02:00 Matthew Brett <matthew.br...@gmail.com>:
>> On Wed, Apr 20, 2016 at 1:59 AM, Olivier Grisel
>> <olivier.gri...@ensta.org> wrote:
>>> Thanks,
>>>
>>> I think next we could upgrade the travis configuration of numpy and
>>> scipy to build and upload manylinux1 wheels to
>>> http://travis-dev-wheels.scipy.org/ for downstream projects to test
>>> against the master branch of numpy and scipy without having to build
>>> those from source.
>>>
>>> However that would require publishing an official pre-built
>>> libopenblas.so (+headers) archive or RPM package. That archive would
>>> serve as the reference library to build scipy stack manylinux1 wheels.
>>
>> There's an OpenBLAS archive up at :
>> http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com/openblas_0.2.18.tgz
>
> Thanks.
>
>> - is that the right place for it?  It gets uploaded by the
>> manylinux-builds travis run.
>
> The only problem with rackspace cloud files is that as of now there is
> no way to put a short domain name (CNAME) with https. Maybe we could
> use the github "release" system on a github repo under the numpy
> github organization to host it. Or alternatively use an external
> binary file host that use github credentials for upload rigths, for
> instance bintray (I have no experience with this yet though).

The github releases idea sounds intriguing.   Do you have any
experience with that?  Are there good examples other than the API
documentation?

https://developer.github.com/v3/repos/releases/

Cheers,

Matthew
___


Re: [Numpy-discussion] linux wheels coming soon

2016-04-20 Thread Matthew Brett
Hi,

On Wed, Apr 20, 2016 at 3:33 AM, Jens Nielsen  wrote:
> Thanks
>
> I can confirm that the new narrow unicode build wheels of Scipy works as
> expected for my project.
> @Olivier Grisel Thanks for finding the Travis issue; it's probably worth
> considering switching the Travis build to 2.7.11 to avoid other similar
> issues.
>
> The old versions of numpy are very handy for downstream testing. I have
> verified that they work as expected in the Matplotlib tests here:
> https://github.com/jenshnielsen/matplotlib/tree/travisnowheelhouse where we
> are testing against numpy 1.6 as the earliest. This branch switches
> matplotlib from the scikit image wheelhouse to manylinux wheels which seems
> to work great.

Jens - any interest in working together on a good matplotlib build recipe?

Matthew
___


Re: [Numpy-discussion] linux wheels coming soon

2016-04-20 Thread Matthew Brett
On Wed, Apr 20, 2016 at 1:59 AM, Olivier Grisel
 wrote:
> Thanks,
>
> I think next we could upgrade the travis configuration of numpy and
> scipy to build and upload manylinux1 wheels to
> http://travis-dev-wheels.scipy.org/ for downstream projects to test
> against the master branch of numpy and scipy without having to build
> those from source.
>
> However that would require publishing an official pre-built
> libopenblas.so (+headers) archive or RPM package. That archive would
> serve as the reference library to build scipy stack manylinux1 wheels.

There's an OpenBLAS archive up at :
http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com/openblas_0.2.18.tgz
- is that the right place for it?  It gets uploaded by the
manylinux-builds travis run.

Cheers,

Matthew
___


Re: [Numpy-discussion] linux wheels coming soon

2016-04-19 Thread Matthew Brett
On Tue, Apr 19, 2016 at 1:12 AM, Olivier Grisel
 wrote:
> I think that would be very useful, e.g. for downstream projects to
> check that they work properly with old versions using a simple pip
> install command on their CI workers.

Done for numpy 1.6.0 through 1.10.4, scipy 0.9 through scipy 0.16.1

Please let me know of any problems,

Matthew
___


Re: [Numpy-discussion] linux wheels coming soon

2016-04-19 Thread Matthew Brett
Hi,

On Mon, Apr 18, 2016 at 2:49 PM, Matthew Brett <matthew.br...@gmail.com> wrote:
> On Sun, Apr 17, 2016 at 9:48 AM, Jens Nielsen <jenshniel...@gmail.com> wrote:
>> I have tested the new cp27m wheels and they seem to work great.
>>
>> @Matthew I am using the:
>>
>> ```
>> sudo: required
>> dist: trusty
>>
>> images mentioned here https://docs.travis-ci.com/user/ci-environment/. As
>> far as I can see you are doing:
>> sudo: false
>> dist: trusty
>>
>> I had no idea such an image existed since it's not documented on
>> https://docs.travis-ci.com/user/ci-environment/
>>
>> Anyway your tests run with python 2.7.9, whereas the sudo: required image ships
>> python 2.7.10, so it's clearly a different python version:
>>
>> @Olivier Grisel this only applies to Travis's own home-built versions of
>> python 2.7 on the Trusty image running on google compute engine.
>> It ships its own prebuilt python version. I don't have any issues with the
>> stock versions on Ubuntu which pip tells me are indeed cp27mu.
>>
>> It seems like the new cp27m wheels works as expected. Thanks a lot
>> Doing:
>>
>> ```
>> python -c "from pip import pep425tags;
>> print(pep425tags.is_manylinux1_compatible());
>> print(pep425tags.have_compatible_glibc(2, 5));
>> print(pep425tags.get_abi_tag())"
>> pip install --timeout=60 --no-index --trusted-host
>> "ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com"
>> --find-links
>> "http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com/;
>> numpy scipy --upgrade
>> ```
>> results in:
>>
>> ```
>> True
>> True
>> cp27m
>> Ignoring indexes: https://pypi.python.org/simple
>> Collecting numpy
>>   Downloading
>> http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com/numpy-1.11.0-cp27-cp27m-manylinux1_x86_64.whl
>> (15.3MB)
>> 100% || 15.3MB 49.0MB/s
>> Collecting scipy
>>   Downloading
>> http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com/scipy-0.17.0-cp27-cp27m-manylinux1_x86_64.whl
>> (39.5MB)
>> 100% || 39.5MB 21.1MB/s
>> Installing collected packages: numpy, scipy
>>   Found existing installation: numpy 1.10.1
>> Uninstalling numpy-1.10.1:
>>   Successfully uninstalled numpy-1.10.1
>> Successfully installed numpy-1.11.0 scipy-0.17.0
>> ```
>> And all my tests pass as expected.
>
> Thanks for testing.

I've also tested a range of numpy and scipy wheels built with the
manylinux docker image.

Built numpy and scipy wheels here:

http://nipy.bic.berkeley.edu/manylinux/

Test script and output here:

http://nipy.bic.berkeley.edu/manylinux/tests/

There are some test failures in the logs there, but I think they are
all known failures from old numpy / scipy versions, particularly

https://github.com/scipy/scipy/issues/5370

Y'all can test for yourselves with something like:

python -m pip install -U pip
pip install -f https://nipy.bic.berkeley.edu/manylinux numpy==1.6.2
scipy==0.16.0

I propose to upload these historical wheels to pypi to make it easier
to test against older versions of numpy / scipy.

Any objections?

Cheers,

Matthew
___


Re: [Numpy-discussion] linux wheels coming soon

2016-04-18 Thread Matthew Brett
On Sun, Apr 17, 2016 at 9:48 AM, Jens Nielsen  wrote:
> I have tested the new cp27m wheels and they seem to work great.
>
> @Matthew I am using the:
>
> ```
> sudo: required
> dist: trusty
>
> images mentioned here https://docs.travis-ci.com/user/ci-environment/. As
> far as I can see you are doing:
> sudo: false
> dist: trusty
>
> I had no idea such an image existed since it's not documented on
> https://docs.travis-ci.com/user/ci-environment/
>
> Anyway your tests run with python 2.7.9, whereas the sudo: required image ships
> python 2.7.10, so it's clearly a different python version:
>
> @Olivier Grisel this only applies to Travis's own home-built versions of
> python 2.7 on the Trusty image running on google compute engine.
> It ships its own prebuilt python version. I don't have any issues with the
> stock versions on Ubuntu which pip tells me are indeed cp27mu.
>
> It seems like the new cp27m wheels works as expected. Thanks a lot
> Doing:
>
> ```
> python -c "from pip import pep425tags;
> print(pep425tags.is_manylinux1_compatible());
> print(pep425tags.have_compatible_glibc(2, 5));
> print(pep425tags.get_abi_tag())"
> pip install --timeout=60 --no-index --trusted-host
> "ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com"
> --find-links
> "http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com/;
> numpy scipy --upgrade
> ```
> results in:
>
> ```
> True
> True
> cp27m
> Ignoring indexes: https://pypi.python.org/simple
> Collecting numpy
>   Downloading
> http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com/numpy-1.11.0-cp27-cp27m-manylinux1_x86_64.whl
> (15.3MB)
> 100% || 15.3MB 49.0MB/s
> Collecting scipy
>   Downloading
> http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com/scipy-0.17.0-cp27-cp27m-manylinux1_x86_64.whl
> (39.5MB)
> 100% || 39.5MB 21.1MB/s
> Installing collected packages: numpy, scipy
>   Found existing installation: numpy 1.10.1
> Uninstalling numpy-1.10.1:
>   Successfully uninstalled numpy-1.10.1
> Successfully installed numpy-1.11.0 scipy-0.17.0
> ```
> And all my tests pass as expected.

Thanks for testing.

I set up a buildbot test to run against a narrow unicode build of Python:

http://nipy.bic.berkeley.edu/builders/manylinux-2.7-debian-narrow/builds/1

All tests pass for me too, so I've done the pypi upload for the narrow
unicode numpy, scipy, cython wheels.

Cheers,

Matthew


Re: [Numpy-discussion] linux wheels coming soon

2016-04-16 Thread Matthew Brett
On Sat, Apr 16, 2016 at 8:02 PM, Matthew Brett <matthew.br...@gmail.com> wrote:
> Hi,
>
> On Thu, Apr 14, 2016 at 11:04 AM, Matthew Brett <matthew.br...@gmail.com> 
> wrote:
>> Hi,
>>
>> On Thu, Apr 14, 2016 at 8:02 AM, Jens Nielsen <jenshniel...@gmail.com> wrote:
>>> I have tried testing the wheels in a project that runs tests on Travis's
>>> Trusty infrastructure. The wheels work great for python 3.5 and save
>>> us several minutes of runtime.
>>>
>>> However, I am having trouble using the wheels on python 2.7 on the same
>>> Trusty machines. It seems to be because the wheels are tagged as cp27-cp27mu
>>> (numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl) whereas
>>> pip.pep425tags.get_abi_tag() returns cp27m on this particular python
>>> version. (Stock python 2.7 installed on Travis 14.04 VMs) Any chance of a
>>> cp27m compatible wheel build?
>>
>> Ouch - do you know where travis-ci's Python 2.7 comes from? I see
>> that the standard apt-get install -y python is a wide (mu) build...
>
> I built some narrow unicode builds
> (numpy-1.11.0-cp27-cp27m-manylinux1_x86_64.whl etc) here:
>
> http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com/
>
> Would you mind testing them to see if they work on travis-ci?

I tried testing on trusty with travis-ci, but it appears to pick up
the mu builds, just as on precise...

https://travis-ci.org/matthew-brett/manylinux-testing/jobs/123652670#L161

Cheers,

Matthew


Re: [Numpy-discussion] linux wheels coming soon

2016-04-16 Thread Matthew Brett
Hi,

On Thu, Apr 14, 2016 at 11:04 AM, Matthew Brett <matthew.br...@gmail.com> wrote:
> Hi,
>
> On Thu, Apr 14, 2016 at 8:02 AM, Jens Nielsen <jenshniel...@gmail.com> wrote:
>> I have tried testing the wheels in a project that runs tests on Travis's
>> Trusty infrastructure. The wheels work great for python 3.5 and save
>> us several minutes of runtime.
>>
>> However, I am having trouble using the wheels on python 2.7 on the same
>> Trusty machines. It seems to be because the wheels are tagged as cp27-cp27mu
>> (numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl) whereas
>> pip.pep425tags.get_abi_tag() returns cp27m on this particular python
>> version. (Stock python 2.7 installed on Travis 14.04 VMs) Any chance of a
>> cp27m compatible wheel build?
>
> Ouch - do you know where travis-ci's Python 2.7 comes from? I see
> that the standard apt-get install -y python is a wide (mu) build...

I built some narrow unicode builds
(numpy-1.11.0-cp27-cp27m-manylinux1_x86_64.whl etc) here:

http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com/

Would you mind testing them to see if they work on travis-ci?

Thanks,

Matthew


Re: [Numpy-discussion] linux wheels coming soon

2016-04-14 Thread Matthew Brett
On Thu, Apr 14, 2016 at 1:47 PM, Jonathan Helmus <jjhel...@gmail.com> wrote:
>
>
> On 4/14/16 3:11 PM, Matthew Brett wrote:
>>
>> On Thu, Apr 14, 2016 at 12:57 PM, Matthew Brett <matthew.br...@gmail.com>
>> wrote:
>>>
>>> On Thu, Apr 14, 2016 at 12:25 PM, Jonathan Helmus <jjhel...@gmail.com>
>>> wrote:
>>>>
>>>>
>>>> On 4/14/16 1:26 PM, Matthew Brett wrote:
>>>>>
>>>>> Hi,
>>>>>
>>>>> On Thu, Apr 14, 2016 at 11:11 AM, Benjamin Root <ben.v.r...@gmail.com>
>>>>> wrote:
>>>>>>
>>>>>> Are we going to have to have documentation somewhere making it clear
>>>>>> that
>>>>>> the numpy wheel shouldn't be used in a conda environment? Not that I
>>>>>> would
>>>>>> expect this issue to come up all that often, but I could imagine a
>>>>>> scenario
>>>>>> where a non-scientist is simply using a base conda distribution
>>>>>> because
>>>>>> that
>>>>>> is what IT put on their system. Then they do "pip install ipython"
>>>>>> that
>>>>>> indirectly brings in numpy (through the matplotlib dependency), and
>>>>>> end
>>>>>> up
>>>>>> with an incompatible numpy because they would have been linked against
>>>>>> different pythons?
>>>>>>
>>>>>> Or is this not an issue?
>>>>>
>>>>> I'm afraid I don't know conda at all, but I'm guessing that pip will
>>>>> not install numpy when it is installed via conda.
>>>>
>>>> Correct, pip will not (or at least should not, and did not in my tests)
>>>> install numpy over top of an existing conda installed numpy.
>>>> Unfortunately
>>>> from my testing, conda will install a conda version of numpy over top of
>>>> a
>>>> pip installed version.  This may be the expected behavior as conda
>>>> maintains
>>>> its own list of installed packages.
>>>>>
>>>>> So the potential difference is that, pre-wheel, if numpy was not
>>>>> installed in your conda environment, then pip would build numpy from
>>>>> source, whereas now you'll get a binary install.
>>>>>
>>>>> I _think_ that Python's binary API specification
>>>>> (pip.pep425tags.get_abi_tag()) should prevent pip from installing an
>>>>> incompatible wheel.  Are there any conda experts out there who can
>>>>> give more detail, or more convincing assurance?
>>>>
>>>> I tested "pip install numpy" in conda environments (conda's equivalent
>>>> to
>>>> virtualenvs) which did not have numpy installed previously for Python
>>>> 2.7,
>>>> 3.4 and 3.5 in a Ubuntu 14.04 Docker container.  In all cases numpy was
>>>> installed from the whl file and appeared to be functional.  Running the
>>>> numpy test suite found three failing tests for Python 2.7 and 3.5 and 21
>>>> errors in Python 3.4. The 2.7 and 3.5 failures do not look concerning
>>>> but
>>>> the 3.4 errors are a bit strange.
>>>> Logs are in
>>>> https://gist.github.com/jjhelmus/a433a66d56fb0e39b8ebde248ad3fe36
>>>
>>> Thanks for testing.  For:
>>>
>>> docker run -ti --rm ubuntu:14.04 /bin/bash
>>>
>>> apt-get update && apt-get install -y curl
>>> curl -LO https://bootstrap.pypa.io/get-pip.py
>>> python3 get-pip.py
>>> pip install numpy nose
>>> python3 -c "import numpy; numpy.test()"
>>>
>>> I get:
>>>
>>> FAILED (KNOWNFAIL=7, SKIP=17, errors=21)
>>>
>>> This is stock Python 3.4 - so not a conda issue.  It is definitely a
>>> problem with the wheel because a compiled numpy wheel on the same
>>> docker image:
>>>
>>> apt-get update && apt-get install -y curl python3-dev
>>> curl -LO https://bootstrap.pypa.io/get-pip.py
>>> python3 get-pip.py
>>> pip install --no-binary=:all: numpy nose
>>> python3 -c "import numpy; numpy.test()"
>>>
>>> gives no test errors.
>>>
>>> It looks like we have some more work to do...
>>
>> Actually, I can solve these errors by first doing:
>>
>> apt-get install gcc
>>
>> I think these must be bugs in the numpy tests where numpy is assuming
>> a functional compiler.
>>
>> Does the conda numpy give test errors when there is no compiler?
>>
>> Cheers,
>>
>> Matthew
>
>
> Yes, both the wheel and conda numpy packages give errors when there is not a
> compiler.  These errors clear when gcc is installed.  Looks like the wheels
> are fine, just forgot about a compiler.

Thanks for checking.  I think the problem is fixed here:

https://github.com/numpy/numpy/pull/7549

Cheers,

Matthew


Re: [Numpy-discussion] linux wheels coming soon

2016-04-14 Thread Matthew Brett
On Thu, Apr 14, 2016 at 8:02 AM, Jens Nielsen  wrote:
> I have tried testing the wheels in a project that runs tests on Travis's
> Trusty infrastructure. The wheels work great for python 3.5 and save
> us several minutes of runtime.
>
> However, I am having trouble using the wheels on python 2.7 on the same
> Trusty machines. It seems to be because the wheels are tagged as cp27-cp27mu
> (numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl) whereas
> pip.pep425tags.get_abi_tag() returns cp27m on this particular python
> version. (Stock python 2.7 installed on Travis 14.04 VMs) Any chance of a
> cp27m compatible wheel build?

Nathaniel / other pip experts - I can't remember the history of these tags.

Is there any danger that an older pip will install a cp27m wheel on a
cp27mu system?
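For reference, the m / mu split reflects how the interpreter was configured at build time, which is visible from sys.maxunicode. A tiny sketch of the mapping (the helper name is mine, not part of pip):

```python
def cp27_abi_tag(maxunicode):
    """CPython 2.7 ABI tag implied by a build's sys.maxunicode.

    Wide-unicode (UCS-4) builds report 0x10FFFF and get the "mu"
    suffix; narrow (UCS-2) builds report 0xFFFF and get plain "m".
    """
    return "cp27mu" if maxunicode == 0x10FFFF else "cp27m"

# On a live 2.7 interpreter: import sys; cp27_abi_tag(sys.maxunicode)
```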

Matthew


Re: [Numpy-discussion] linux wheels coming soon

2016-04-14 Thread Matthew Brett
On Thu, Apr 14, 2016 at 12:57 PM, Matthew Brett <matthew.br...@gmail.com> wrote:
> On Thu, Apr 14, 2016 at 12:25 PM, Jonathan Helmus <jjhel...@gmail.com> wrote:
>>
>>
>> On 4/14/16 1:26 PM, Matthew Brett wrote:
>>>
>>> Hi,
>>>
>>> On Thu, Apr 14, 2016 at 11:11 AM, Benjamin Root <ben.v.r...@gmail.com>
>>> wrote:
>>>>
>>>> Are we going to have to have documentation somewhere making it clear that
>>>> the numpy wheel shouldn't be used in a conda environment? Not that I
>>>> would
>>>> expect this issue to come up all that often, but I could imagine a
>>>> scenario
>>>> where a non-scientist is simply using a base conda distribution because
>>>> that
>>>> is what IT put on their system. Then they do "pip install ipython" that
>>>> indirectly brings in numpy (through the matplotlib dependency), and end
>>>> up
>>>> with an incompatible numpy because they would have been linked against
>>>> different pythons?
>>>>
>>>> Or is this not an issue?
>>>
>>> I'm afraid I don't know conda at all, but I'm guessing that pip will
>>> not install numpy when it is installed via conda.
>>
>> Correct, pip will not (or at least should not, and did not in my tests)
>> install numpy over top of an existing conda installed numpy. Unfortunately
>> from my testing, conda will install a conda version of numpy over top of a
>> pip installed version.  This may be the expected behavior as conda maintains
>> its own list of installed packages.
>>>
>>> So the potential difference is that, pre-wheel, if numpy was not
>>> installed in your conda environment, then pip would build numpy from
>>> source, whereas now you'll get a binary install.
>>>
>>> I _think_ that Python's binary API specification
>>> (pip.pep425tags.get_abi_tag()) should prevent pip from installing an
>>> incompatible wheel.  Are there any conda experts out there who can
>>> give more detail, or more convincing assurance?
>>
>> I tested "pip install numpy" in conda environments (conda's equivalent to
>> virtualenvs) which did not have numpy installed previously for Python 2.7,
>> 3.4 and 3.5 in a Ubuntu 14.04 Docker container.  In all cases numpy was
>> installed from the whl file and appeared to be functional.  Running the
>> numpy test suite found three failing tests for Python 2.7 and 3.5 and 21
>> errors in Python 3.4. The 2.7 and 3.5 failures do not look concerning but
>> the 3.4 errors are a bit strange.
>> Logs are in
>> https://gist.github.com/jjhelmus/a433a66d56fb0e39b8ebde248ad3fe36
>
> Thanks for testing.  For:
>
> docker run -ti --rm ubuntu:14.04 /bin/bash
>
> apt-get update && apt-get install -y curl
> curl -LO https://bootstrap.pypa.io/get-pip.py
> python3 get-pip.py
> pip install numpy nose
> python3 -c "import numpy; numpy.test()"
>
> I get:
>
> FAILED (KNOWNFAIL=7, SKIP=17, errors=21)
>
> This is stock Python 3.4 - so not a conda issue.  It is definitely a
> problem with the wheel because a compiled numpy wheel on the same
> docker image:
>
> apt-get update && apt-get install -y curl python3-dev
> curl -LO https://bootstrap.pypa.io/get-pip.py
> python3 get-pip.py
> pip install --no-binary=:all: numpy nose
> python3 -c "import numpy; numpy.test()"
>
> gives no test errors.
>
> It looks like we have some more work to do...

Actually, I can solve these errors by first doing:

apt-get install gcc

I think these must be bugs in the numpy tests where numpy is assuming
a functional compiler.

Does the conda numpy give test errors when there is no compiler?
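One way such tests could defend themselves — a sketch of the idea using Python 3's shutil.which, not numpy's actual fix — is to skip when no C compiler is found on PATH:

```python
import shutil
import unittest

def have_c_compiler():
    """Rough check: is a common C compiler executable on PATH?"""
    return any(shutil.which(cc) for cc in ("cc", "gcc", "clang"))

@unittest.skipUnless(have_c_compiler(), "test needs a working C compiler")
class TestNeedsCompiler(unittest.TestCase):
    def test_compile_something(self):
        # ... the actual compile-and-run check would go here ...
        self.assertTrue(True)
```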

Cheers,

Matthew


Re: [Numpy-discussion] linux wheels coming soon

2016-04-14 Thread Matthew Brett
On Thu, Apr 14, 2016 at 12:25 PM, Jonathan Helmus <jjhel...@gmail.com> wrote:
>
>
> On 4/14/16 1:26 PM, Matthew Brett wrote:
>>
>> Hi,
>>
>> On Thu, Apr 14, 2016 at 11:11 AM, Benjamin Root <ben.v.r...@gmail.com>
>> wrote:
>>>
>>> Are we going to have to have documentation somewhere making it clear that
>>> the numpy wheel shouldn't be used in a conda environment? Not that I
>>> would
>>> expect this issue to come up all that often, but I could imagine a
>>> scenario
>>> where a non-scientist is simply using a base conda distribution because
>>> that
>>> is what IT put on their system. Then they do "pip install ipython" that
>>> indirectly brings in numpy (through the matplotlib dependency), and end
>>> up
>>> with an incompatible numpy because they would have been linked against
>>> different pythons?
>>>
>>> Or is this not an issue?
>>
>> I'm afraid I don't know conda at all, but I'm guessing that pip will
>> not install numpy when it is installed via conda.
>
> Correct, pip will not (or at least should not, and did not in my tests)
> install numpy over top of an existing conda installed numpy. Unfortunately
> from my testing, conda will install a conda version of numpy over top of a
> pip installed version.  This may be the expected behavior as conda maintains
> its own list of installed packages.
>>
>> So the potential difference is that, pre-wheel, if numpy was not
>> installed in your conda environment, then pip would build numpy from
>> source, whereas now you'll get a binary install.
>>
>> I _think_ that Python's binary API specification
>> (pip.pep425tags.get_abi_tag()) should prevent pip from installing an
>> incompatible wheel.  Are there any conda experts out there who can
>> give more detail, or more convincing assurance?
>
> I tested "pip install numpy" in conda environments (conda's equivalent to
> virtualenvs) which did not have numpy installed previously for Python 2.7,
> 3.4 and 3.5 in a Ubuntu 14.04 Docker container.  In all cases numpy was
> installed from the whl file and appeared to be functional.  Running the
> numpy test suite found three failing tests for Python 2.7 and 3.5 and 21
> errors in Python 3.4. The 2.7 and 3.5 failures do not look concerning but
> the 3.4 errors are a bit strange.
> Logs are in
> https://gist.github.com/jjhelmus/a433a66d56fb0e39b8ebde248ad3fe36

Thanks for testing.  For:

docker run -ti --rm ubuntu:14.04 /bin/bash

apt-get update && apt-get install -y curl
curl -LO https://bootstrap.pypa.io/get-pip.py
python3 get-pip.py
pip install numpy nose
python3 -c "import numpy; numpy.test()"

I get:

FAILED (KNOWNFAIL=7, SKIP=17, errors=21)

This is stock Python 3.4 - so not a conda issue.  It is definitely a
problem with the wheel because a compiled numpy wheel on the same
docker image:

apt-get update && apt-get install -y curl python3-dev
curl -LO https://bootstrap.pypa.io/get-pip.py
python3 get-pip.py
pip install --no-binary=:all: numpy nose
python3 -c "import numpy; numpy.test()"

gives no test errors.

It looks like we have some more work to do...

Cheers,

Matthew


Re: [Numpy-discussion] linux wheels coming soon

2016-04-14 Thread Matthew Brett
Hi,

On Thu, Apr 14, 2016 at 11:11 AM, Benjamin Root  wrote:
> Are we going to have to have documentation somewhere making it clear that
> the numpy wheel shouldn't be used in a conda environment? Not that I would
> expect this issue to come up all that often, but I could imagine a scenario
> where a non-scientist is simply using a base conda distribution because that
> is what IT put on their system. Then they do "pip install ipython" that
> indirectly brings in numpy (through the matplotlib dependency), and end up
> with an incompatible numpy because they would have been linked against
> different pythons?
>
> Or is this not an issue?

I'm afraid I don't know conda at all, but I'm guessing that pip will
not install numpy when it is installed via conda.

So the potential difference is that, pre-wheel, if numpy was not
installed in your conda environment, then pip would build numpy from
source, whereas now you'll get a binary install.

I _think_ that Python's binary API specification
(pip.pep425tags.get_abi_tag()) should prevent pip from installing an
incompatible wheel.  Are there any conda experts out there who can
give more detail, or more convincing assurance?
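Reduced to just the ABI tag, the check pip applies amounts to something like this (an illustrative sketch — real pip also matches the python and platform tags):

```python
def wheel_abi_ok(wheel_abi_tag, interpreter_abi_tag):
    """A wheel is installable only if its ABI tag matches the running
    interpreter's tag, or is the build-independent "none" tag."""
    return wheel_abi_tag in (interpreter_abi_tag, "none")

# A cp27mu wheel should be rejected on a narrow-unicode (cp27m) build:
wheel_abi_ok("cp27mu", "cp27m")  # False
```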

Cheers,

Matthew


Re: [Numpy-discussion] linux wheels coming soon

2016-04-14 Thread Matthew Brett
Hi,

On Thu, Apr 14, 2016 at 8:02 AM, Jens Nielsen  wrote:
> I have tried testing the wheels in a project that runs tests on Travis's
> Trusty infrastructure. The wheels work great for python 3.5 and save
> us several minutes of runtime.
>
> However, I am having trouble using the wheels on python 2.7 on the same
> Trusty machines. It seems to be because the wheels are tagged as cp27-cp27mu
> (numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl) whereas
> pip.pep425tags.get_abi_tag() returns cp27m on this particular python
> version. (Stock python 2.7 installed on Travis 14.04 VMs) Any chance of a
> cp27m compatible wheel build?

Ouch - do you know where travis-ci's Python 2.7 comes from? I see
that the standard apt-get install -y python is a wide (mu) build...

Cheers,

Matthew


Re: [Numpy-discussion] linux wheels coming soon

2016-04-13 Thread Matthew Brett
On Wed, Apr 13, 2016 at 1:29 PM, Oscar Benjamin
<oscar.j.benja...@gmail.com> wrote:
> On 13 April 2016 at 20:15, Matthew Brett <matthew.br...@gmail.com> wrote:
>> Done.  If y'all are on linux, and you have pip >= 8.11,  you should
>> now see this kind of thing:
>
> That's fantastic. Thanks Matt!
>
> I just test installed this and ran numpy.test(). All tests passed but
> then I got a segfault at the end by (semi-accidentally) hitting Ctrl-C
> at the prompt:
>
> $ python
> Python 2.7.9 (default, Apr  2 2015, 15:33:21)
> [GCC 4.9.2] on linux2
> Type "help", "copyright", "credits" or "license" for more information.
>>>> import numpy
>>>> numpy.test()
> Running unit tests for numpy
> 
> Ran 5781 tests in 72.238s
>
> OK (KNOWNFAIL=6, SKIP=15)
> 
>>>> Segmentation fault (core dumped)
>
> It was stopped at the prompt and then I did Ctrl-C and then the
> seg-fault message.
>
> $ uname -a
> Linux vnwulf 3.19.0-15-generic #15-Ubuntu SMP Thu Apr 16 23:32:37 UTC
> 2015 x86_64 x86_64 x86_64 GNU/Linux
> $ lsb_release -a
> No LSB modules are available.
> Distributor ID: Ubuntu
> Description:    Ubuntu 15.04
> Release:        15.04
> Codename:       vivid
>

Thanks so much for testing - that's very useful.

I get the same thing on my Debian Sid machine.

Actually I also get the same thing with a local compile against Debian
ATLAS, here's the stack trace after:

>>> import numpy; numpy.test()
>>> # Ctrl-C

https://gist.github.com/f6d8fb42f24689b39536a2416d717056

Do you get this as well?
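Not something mentioned in the thread, but for chasing this kind of exit-time segfault the stdlib faulthandler module (Python 3.3+; a backport was available on PyPI for 2.7) can dump a Python-level traceback on fatal signals:

```python
import faulthandler

# After this, SIGSEGV / SIGFPE / SIGABRT / SIGBUS / SIGILL dump a
# Python traceback to stderr before the process dies.
faulthandler.enable()

print(faulthandler.is_enabled())  # True
```

The same can be switched on from the command line with `python -X faulthandler`.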

Cheers,

Matthew


Re: [Numpy-discussion] linux wheels coming soon

2016-04-13 Thread Matthew Brett
On Tue, Apr 12, 2016 at 7:15 PM, Matthew Brett <matthew.br...@gmail.com> wrote:
> Hi,
>
> On Sat, Apr 2, 2016 at 6:11 PM, Matthew Brett <matthew.br...@gmail.com> wrote:
>> On Fri, Mar 25, 2016 at 6:39 AM, Peter Cock <p.j.a.c...@googlemail.com> 
>> wrote:
>>> On Fri, Mar 25, 2016 at 3:02 AM, Robert T. McGibbon <rmcgi...@gmail.com> 
>>> wrote:
>>>> I suspect that many of the maintainers of major scipy-ecosystem projects 
>>>> are
>>>> aware of these (or other similar) travis wheel caches, but would guess that
>>>> the pool of travis-ci python users who weren't aware of these wheel caches
>>>> is much much larger. So there will still be a lot of travis-ci clock cycles
>>>> saved by manylinux wheels.
>>>>
>>>> -Robert
>>>
>>> Yes exactly. Availability of NumPy Linux wheels on PyPI is definitely 
>>> something
>>> I would suggest adding to the release notes. Hopefully this will help 
>>> trigger
>>> a general availability of wheels in the numpy-ecosystem :)
>>>
>>> In the case of Travis CI, their VM images for Python already have a version
>>> of NumPy installed, but having the latest version of NumPy and SciPy etc
>>> available as Linux wheels would be very nice.
>>
>> We're very nearly there now.
>>
>> The latest versions of numpy, scipy, scikit-image, pandas, numexpr,
>> statsmodels wheels for testing at
>> http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com/
>>
>> Please do test with:
>>
>> python -m pip install --upgrade pip
>>
>> pip install 
>> --trusted-host=ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com
>> --find-links=http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com
>> numpy scipy scikit-learn numexpr
>>
>> python -c 'import numpy; numpy.test("full")'
>> python -c 'import scipy; scipy.test("full")'
>>
>> We would love to get any feedback as to whether these work on your machines.
>
> I've just rebuilt these wheels with the just-released OpenBLAS 0.2.18.
>
> OpenBLAS is now passing all its own tests and tests on numpy / scipy /
> scikit-learn at http://build.openblas.net/builders
>
> Our tests of the wheels look good too:
>
> http://nipy.bic.berkeley.edu/builders/manylinux-2.7-debian
> http://nipy.bic.berkeley.edu/builders/manylinux-2.7-debian
> https://travis-ci.org/matthew-brett/manylinux-testing
>
> So I think these are ready to go.  I propose uploading these wheels
> for numpy and scipy to pypi tomorrow unless anyone has an objection.

Done.  If y'all are on linux, and you have pip >= 8.11,  you should
now see this kind of thing:

$ pip install numpy scipy
Collecting numpy
  Downloading numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl (15.3MB)
100% || 15.3MB 61kB/s
Collecting scipy
  Downloading scipy-0.17.0-cp27-cp27mu-manylinux1_x86_64.whl (39.5MB)
100% || 39.5MB 24kB/s
Installing collected packages: numpy, scipy
Successfully installed numpy-1.11.0 scipy-0.17.0

Cheers,

Matthew


Re: [Numpy-discussion] linux wheels coming soon

2016-04-12 Thread Matthew Brett
Hi,

On Sat, Apr 2, 2016 at 6:11 PM, Matthew Brett <matthew.br...@gmail.com> wrote:
> On Fri, Mar 25, 2016 at 6:39 AM, Peter Cock <p.j.a.c...@googlemail.com> wrote:
>> On Fri, Mar 25, 2016 at 3:02 AM, Robert T. McGibbon <rmcgi...@gmail.com> 
>> wrote:
>>> I suspect that many of the maintainers of major scipy-ecosystem projects are
>>> aware of these (or other similar) travis wheel caches, but would guess that
>>> the pool of travis-ci python users who weren't aware of these wheel caches
>>> is much much larger. So there will still be a lot of travis-ci clock cycles
>>> saved by manylinux wheels.
>>>
>>> -Robert
>>
>> Yes exactly. Availability of NumPy Linux wheels on PyPI is definitely 
>> something
>> I would suggest adding to the release notes. Hopefully this will help trigger
>> a general availability of wheels in the numpy-ecosystem :)
>>
>> In the case of Travis CI, their VM images for Python already have a version
>> of NumPy installed, but having the latest version of NumPy and SciPy etc
>> available as Linux wheels would be very nice.
>
> We're very nearly there now.
>
> The latest versions of numpy, scipy, scikit-image, pandas, numexpr,
> statsmodels wheels for testing at
> http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com/
>
> Please do test with:
>
> python -m pip install --upgrade pip
>
> pip install 
> --trusted-host=ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com
> --find-links=http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com
> numpy scipy scikit-learn numexpr
>
> python -c 'import numpy; numpy.test("full")'
> python -c 'import scipy; scipy.test("full")'
>
> We would love to get any feedback as to whether these work on your machines.

I've just rebuilt these wheels with the just-released OpenBLAS 0.2.18.

OpenBLAS is now passing all its own tests and tests on numpy / scipy /
scikit-learn at http://build.openblas.net/builders

Our tests of the wheels look good too:

http://nipy.bic.berkeley.edu/builders/manylinux-2.7-debian
http://nipy.bic.berkeley.edu/builders/manylinux-2.7-debian
https://travis-ci.org/matthew-brett/manylinux-testing

So I think these are ready to go.  I propose uploading these wheels
for numpy and scipy to pypi tomorrow unless anyone has an objection.

Cheers,

Matthew


Re: [Numpy-discussion] Using OpenBLAS for manylinux wheels

2016-04-12 Thread Matthew Brett
On Wed, Apr 6, 2016 at 6:47 AM, Olivier Grisel  wrote:
> I updated the issue:
>
> https://github.com/xianyi/OpenBLAS-CI/issues/10#issuecomment-206195714
>
> The random test_nanmedian_all_axis failure is unrelated to openblas
> and should be ignored.

It looks like all is well now, at least for the OpenBLAS buildbots :
http://build.openblas.net/builders

Matthew


Re: [Numpy-discussion] ndarray.T2 for 2D transpose

2016-04-07 Thread Matthew Brett
On Thu, Apr 7, 2016 at 11:17 AM, Chris Barker  wrote:
> On Thu, Apr 7, 2016 at 10:00 AM, Ian Henriksen
>  wrote:
>>
>> Here's another example that I've seen catch people now and again.
>>
>> A = np.random.rand(100, 100)
>> b =  np.random.rand(10)
>> A * b.T
>
>
> typo? that was supposed to be
>
> b =  np.random.rand(100). yes?
>
> This is exactly what someone else referred to as the expectations of someone
> that comes from MATLAB, and doesn't yet "get" that 1D arrays are 1D arrays.
>
> All of this is EXACTLY the motivation for the matrix class -- which never
> took off, and was never complete (it needed a row and column vector
> implementation, if you ask me). But I think the reason it didn't take off is
> that it really isn't that useful, but is different enough from regular
> arrays to be a greater source of confusion. And it was decided that all
> people REALLY wanted was an obvious way to get matrix multiply, which we now
> have with @.
>
> So this discussion brings up that we also need an easy and obvious way to
> make a column vector --
>
> maybe:
>
> np.col_vector(arr)
>
> which would be a synonym for np.reshape(arr, (-1,1))

Yes, I was going to suggest `colvec` and `rowvec`.
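A minimal sketch of what such helpers could look like (hypothetical names, not an existing numpy API):

```python
import numpy as np

def colvec(arr):
    """Reshape a 1-D array into an (n, 1) column vector."""
    return np.reshape(arr, (-1, 1))

def rowvec(arr):
    """Reshape a 1-D array into a (1, n) row vector."""
    return np.reshape(arr, (1, -1))

b = np.random.rand(100)
colvec(b).shape  # (100, 1)
rowvec(b).shape  # (1, 100)
```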

Matthew


Re: [Numpy-discussion] Using OpenBLAS for manylinux wheels

2016-04-05 Thread Matthew Brett
On Mon, Mar 28, 2016 at 2:33 PM, Matthew Brett <matthew.br...@gmail.com> wrote:
> Hi,
>
> Olivier Grisel and I are working on building and testing manylinux
> wheels for numpy and scipy.
>
> We first thought that we should use ATLAS BLAS, but Olivier found that
> my build of these could be very slow [1].  I set up a testing grid [2]
> which found test errors for numpy and scipy using ATLAS wheels.
>
> On the other hand, the same testing grid finds no errors or failures
> [3] using latest OpenBLAS (0.2.17) and running tests for:
>
> numpy
> scipy
> scikit-learn
> numexpr
> pandas
> statsmodels
>
> This is on the travis-ci ubuntu VMs.
>
> Please do test on your own machines with something like this script [4]:
>
> source test_manylinux.sh
>
> We have worried in the past about the reliability of OpenBLAS, but I
> find these tests reassuring.
>
> Are there any other tests of OpenBLAS that we should run to assure
> ourselves that it is safe to use?

Here is an update on progress:

We've now done a lot of testing on the Linux OpenBLAS wheels.   They
pass all tests on Linux, with Intel kernels:

https://travis-ci.org/matthew-brett/manylinux-testing/builds/120825485
http://nipy.bic.berkeley.edu/builders/manylinux-2.7-debian/builds/22
http://nipy.bic.berkeley.edu/builders/manylinux-2.7-fedora/builds/10

Xianyi, the maintainer of OpenBLAS, is very helpfully running the
OpenBLAS buildbot nightly tests with numpy and scipy:

http://build.openblas.net/builders

There is still one BLAS-related failure on these tests on AMD chips:

https://github.com/xianyi/OpenBLAS-CI/issues/10

I propose to hold off distributing the OpenBLAS wheels until the
OpenBLAS tests are clean on the OpenBLAS buildbots - any objections?

Cheers,

Matthew


Re: [Numpy-discussion] linux wheels coming soon

2016-04-04 Thread Matthew Brett
Hi,

On Mon, Apr 4, 2016 at 9:02 AM, Peter Cock <p.j.a.c...@googlemail.com> wrote:
> On Sun, Apr 3, 2016 at 2:11 AM, Matthew Brett <matthew.br...@gmail.com> wrote:
>> On Fri, Mar 25, 2016 at 6:39 AM, Peter Cock <p.j.a.c...@googlemail.com> 
>> wrote:
>>> On Fri, Mar 25, 2016 at 3:02 AM, Robert T. McGibbon <rmcgi...@gmail.com> 
>>> wrote:
>>>> I suspect that many of the maintainers of major scipy-ecosystem projects 
>>>> are
>>>> aware of these (or other similar) travis wheel caches, but would guess that
>>>> the pool of travis-ci python users who weren't aware of these wheel caches
>>>> is much much larger. So there will still be a lot of travis-ci clock cycles
>>>> saved by manylinux wheels.
>>>>
>>>> -Robert
>>>
>>> Yes exactly. Availability of NumPy Linux wheels on PyPI is definitely 
>>> something
>>> I would suggest adding to the release notes. Hopefully this will help 
>>> trigger
>>> a general availability of wheels in the numpy-ecosystem :)
>>>
>>> In the case of Travis CI, their VM images for Python already have a version
>>> of NumPy installed, but having the latest version of NumPy and SciPy etc
>>> available as Linux wheels would be very nice.
>>
>> We're very nearly there now.
>>
>> The latest versions of numpy, scipy, scikit-image, pandas, numexpr,
>> statsmodels wheels for testing at
>> http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com/
>>
>> Please do test with:
>> ...
>>
>> We would love to get any feedback as to whether these work on your machines.
>
> Hi Matthew,
>
> Testing on a 64bit CentOS 6 machine with Python 3.5 compiled
> from source under my home directory:
>
>
> $ python3.5 -m pip install --upgrade pip
> Requirement already up-to-date: pip in ./lib/python3.5/site-packages
>
> $ python3.5 -m pip install
> --trusted-host=ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com
> --find-links=http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com
> numpy scipy
> Requirement already satisfied (use --upgrade to upgrade): numpy in
> ./lib/python3.5/site-packages
> Requirement already satisfied (use --upgrade to upgrade): scipy in
> ./lib/python3.5/site-packages
>
> $ python3.5 -m pip install
> --trusted-host=ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com
> --find-links=http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com
> numpy scipy --upgrade
> Collecting numpy
>   Downloading 
> http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com/numpy-1.11.0-cp35-cp35m-manylinux1_x86_64.whl
> (15.5MB)
> 100% || 15.5MB 42.1MB/s
> Collecting scipy
>   Downloading 
> http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com/scipy-0.17.0-cp35-cp35m-manylinux1_x86_64.whl
> (40.8MB)
> 100% || 40.8MB 53.6MB/s
> Installing collected packages: numpy, scipy
>   Found existing installation: numpy 1.10.4
> Uninstalling numpy-1.10.4:
>   Successfully uninstalled numpy-1.10.4
>   Found existing installation: scipy 0.16.0
> Uninstalling scipy-0.16.0:
>   Successfully uninstalled scipy-0.16.0
> Successfully installed numpy-1.11.0 scipy-0.17.0
>
>
> $ python3.5 -c 'import numpy; numpy.test("full")'
> Running unit tests for numpy
> NumPy version 1.11.0
> NumPy relaxed strides checking option: False
> NumPy is installed in /home/xxx/lib/python3.5/site-packages/numpy
> Python version 3.5.0 (default, Sep 28 2015

Re: [Numpy-discussion] linux wheels coming soon

2016-04-02 Thread Matthew Brett
On Fri, Mar 25, 2016 at 6:39 AM, Peter Cock  wrote:
> On Fri, Mar 25, 2016 at 3:02 AM, Robert T. McGibbon  
> wrote:
>> I suspect that many of the maintainers of major scipy-ecosystem projects are
>> aware of these (or other similar) travis wheel caches, but would guess that
>> the pool of travis-ci python users who weren't aware of these wheel caches
>> is much much larger. So there will still be a lot of travis-ci clock cycles
>> saved by manylinux wheels.
>>
>> -Robert
>
> Yes exactly. Availability of NumPy Linux wheels on PyPI is definitely 
> something
> I would suggest adding to the release notes. Hopefully this will help trigger
> a general availability of wheels in the numpy-ecosystem :)
>
> In the case of Travis CI, their VM images for Python already have a version
> of NumPy installed, but having the latest version of NumPy and SciPy etc
> available as Linux wheels would be very nice.

We're very nearly there now.

The latest versions of numpy, scipy, scikit-image, pandas, numexpr,
statsmodels wheels for testing at
http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com/

Please do test with:

python -m pip install --upgrade pip

pip install 
--trusted-host=ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com
--find-links=http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com
numpy scipy scikit-learn numexpr

python -c 'import numpy; numpy.test("full")'
python -c 'import scipy; scipy.test("full")'

We would love to get any feedback as to whether these work on your machines.

Cheers,

Matthew


Re: [Numpy-discussion] Using OpenBLAS for manylinux wheels

2016-03-30 Thread Matthew Brett
On Wed, Mar 30, 2016 at 2:39 PM, Carl Kleffner  wrote:
> I would like to see OpenBLAS support for numpy on windows. The latest
> OpenBLAS windows builds with numpy support are on
> https://bitbucket.org/carlkl/mingw-w64-for-python/downloads now. Scipy
> wheels should work regardless of whether numpy was built with MSVC or with
> mingwpy. It is only mandatory to agree about the BLAS implementation used.

How do the tests look for `numpy.test("full")` and `scipy.test("full")`
with OpenBLAS 0.2.17 on Windows?

Sorry if you said before.

Thanks,

Matthew


[Numpy-discussion] Using OpenBLAS for manylinux wheels

2016-03-28 Thread Matthew Brett
Hi,

Olivier Grisel and I are working on building and testing manylinux
wheels for numpy and scipy.

We first thought that we should use ATLAS BLAS, but Olivier found that
my build of these could be very slow [1].  I set up a testing grid [2]
which found test errors for numpy and scipy using ATLAS wheels.

On the other hand, the same testing grid finds no errors or failures
[3] using latest OpenBLAS (0.2.17) and running tests for:

numpy
scipy
scikit-learn
numexpr
pandas
statsmodels

This is on the travis-ci ubuntu VMs.

Please do test on your own machines with something like this script [4]:

source test_manylinux.sh

We have worried in the past about the reliability of OpenBLAS, but I
find these tests reassuring.

Are there any other tests of OpenBLAS that we should run to assure
ourselves that it is safe to use?

Matthew

[1] https://github.com/matthew-brett/manylinux-builds/issues/4#issue-143530908
[2] https://travis-ci.org/matthew-brett/manylinux-testing/builds/118780781
[3] I disabled a few pandas tests which were failing for reasons not
related to BLAS.  Some of the statsmodels test runs time out.
[4] https://gist.github.com/matthew-brett/2fd9d9a29e022c297634


[Numpy-discussion] ATLAS build errors

2016-03-26 Thread Matthew Brett
Hi,

I'm working on building manylinux wheels for numpy, and I ran into
unexpected problems with a numpy built against the ATLAS 3.8 binaries
supplied by CentOS 5.

I'm working on the manylinux docker container [1]

To get ATLAS, I'm doing `yum install atlas-devel` which gets the
default CentOS 5 ATLAS packages.

I then build numpy.  Local tests work fine, but when I test on travis,
I get these errors [2]:

==
ERROR: test_svd_build (test_regression.TestRegression)
--
Traceback (most recent call last):
  File 
"/home/travis/build/matthew-brett/manylinux-testing/venv/lib/python2.7/site-packages/numpy/linalg/tests/test_regression.py",
line 56, in test_svd_build
u, s, vh = linalg.svd(a)
  File 
"/home/travis/build/matthew-brett/manylinux-testing/venv/lib/python2.7/site-packages/numpy/linalg/linalg.py",
line 1359, in svd
u, s, vt = gufunc(a, signature=signature, extobj=extobj)
ValueError: On entry to DGESDD parameter number 12 had an illegal value

==
FAIL: test_lapack (test_build.TestF77Mismatch)
--
Traceback (most recent call last):
  File 
"/home/travis/build/matthew-brett/manylinux-testing/venv/lib/python2.7/site-packages/numpy/testing/decorators.py",
line 146, in skipper_func
return f(*args, **kwargs)
  File 
"/home/travis/build/matthew-brett/manylinux-testing/venv/lib/python2.7/site-packages/numpy/linalg/tests/test_build.py",
line 56, in test_lapack
information.""")
AssertionError: Both g77 and gfortran runtimes linked in lapack_lite !
This is likely to
cause random crashes and wrong results. See numpy INSTALL.txt for more
information.


Sure enough, scipy built the same way segfaults or fails to import (see [2]).

I get no errors for an openblas build.

Does anyone recognize these?   How should I modify the build to avoid them?

Cheers,

Matthew


[1] https://github.com/pypa/manylinux
[2] https://travis-ci.org/matthew-brett/manylinux-testing/jobs/118712090


Re: [Numpy-discussion] Windows wheels, built, but should we deploy?

2016-03-07 Thread Matthew Brett
On Sat, Mar 5, 2016 at 10:40 AM, Matthew Brett <matthew.br...@gmail.com> wrote:
> Hi,
>
> On Fri, Mar 4, 2016 at 8:40 PM, Nathaniel Smith <n...@pobox.com> wrote:
>> On Fri, Mar 4, 2016 at 7:30 PM,  <josef.p...@gmail.com> wrote:
>> [...]
>>> AFAIK, numpy doesn't provide access to BLAS/LAPACK. scipy does. statsmodels
>>> is linking to the installed BLAS/LAPACK in cython code through scipy. So far
>>> we haven't seen problems with different versions. I think scipy development
>>> works very well to isolate linalg library version specific parts from the
>>> user interface.
>>
>> Yeah, it should be invisible to users of both numpy and scipy which
>> BLAS/LAPACK is in use under the hood.
>
> My impression is that the general mood here is positive, so I plan to
> deploy these wheels to pypi on Monday, with the change to the pypi
> text.   Please do let me know if there are any strong objections.

Done:

(py35) PS C:\tmp> pip install numpy
Collecting numpy
  Downloading numpy-1.10.4-cp35-none-win32.whl (6.6MB)
100% || 6.6MB 34kB/s
Installing collected packages: numpy

Cheers,

Matthew


Re: [Numpy-discussion] Windows wheels, built, but should we deploy?

2016-03-05 Thread Matthew Brett
Hi,

On Fri, Mar 4, 2016 at 8:40 PM, Nathaniel Smith  wrote:
> On Fri, Mar 4, 2016 at 7:30 PM,   wrote:
> [...]
>> AFAIK, numpy doesn't provide access to BLAS/LAPACK. scipy does. statsmodels
>> is linking to the installed BLAS/LAPACK in cython code through scipy. So far
>> we haven't seen problems with different versions. I think scipy development
>> works very well to isolate linalg library version specific parts from the
>> user interface.
>
> Yeah, it should be invisible to users of both numpy and scipy which
> BLAS/LAPACK is in use under the hood.

My impression is that the general mood here is positive, so I plan to
deploy these wheels to pypi on Monday, with the change to the pypi
text.   Please do let me know if there are any strong objections.

Cheers,

Matthew


Re: [Numpy-discussion] Windows wheels, built, but should we deploy?

2016-03-04 Thread Matthew Brett
On Fri, Mar 4, 2016 at 12:29 AM, David Cournapeau <courn...@gmail.com> wrote:
>
>
> On Fri, Mar 4, 2016 at 4:42 AM, Matthew Brett <matthew.br...@gmail.com>
> wrote:
>>
>> Hi,
>>
>> Summary:
>>
>> I propose that we upload Windows wheels to pypi.  The wheels are
>> likely to be stable and relatively easy to maintain, but will have
>> slower performance than other versions of numpy linked against faster
>> BLAS / LAPACK libraries.
>>
>> Background:
>>
>> There's a long discussion going on at issue github #5479 [1], where
>> the old problem of Windows wheels for numpy came up.
>>
>> For those of you not following this issue, the current situation for
>> community-built numpy Windows binaries is dire:
>>
>> * We have not so far provided windows wheels on pypi, so `pip install
>> numpy` on Windows will bring you a world of pain;
>> * Until recently we did provide .exe "superpack" installers on
>> sourceforge, but these became increasingly difficult to build and we
>> gave up building them as of the latest (1.10.4) release.
>>
>> Despite this, popularity of Windows wheels on pypi is high.   A few
>> weeks ago, Donald Stufft ran a query for the binary wheels most often
>> downloaded from pypi, for any platform [2] . The top five most
>> downloaded were (n_downloads, name):
>>
>> 6646,
>> numpy-1.10.4-cp27-none-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
>> 5445, cryptography-1.2.1-cp27-none-win_amd64.whl
>> 5243, matplotlib-1.4.0-cp34-none-win32.whl
>> 5241, scikit_learn-0.15.1-cp34-none-win32.whl
>> 4573, pandas-0.17.1-cp27-none-win_amd64.whl
>>
>> So a) the OSX numpy wheel is very popular and b) despite the fact that
>> we don't provide a numpy wheel for Windows, matplotlib, scikit_learn
>> and pandas, that depend on numpy, are the 3rd, 4th and 5th most
>> downloaded wheels as of a few weeks ago.
>>
>> So, there seems to be a large appetite for numpy wheels.
>>
>> Current proposal:
>>
>> I have now built numpy wheels, using the ATLAS blas / lapack library -
>> the build is automatic and reproducible [3].
>>
>> I chose ATLAS to build against, rather than, say OpenBLAS, because
>> we've had some significant worries in the past about the reliability
>> of OpenBLAS, and I thought it better to err on the side of
>> correctness.
>>
>> However, these builds are relatively slow for matrix multiply and
>> other linear algebra routines compared to numpy built against OpenBLAS or
>> MKL (which we cannot use because of its license) [4].   In my very
>> crude array test of a dot product and matrix inversion, the ATLAS
>> wheels were 2-3 times slower than MKL.  Other benchmarks on Julia
>> found about the same result for ATLAS vs OpenBLAS on 32-bit, but a
>> much bigger difference on 64-bit (for an earlier version of ATLAS than
>> we are currently using) [5].
>>
>> So, our numpy wheels are likely to be stable and give correct results, but
>> will be somewhat slow for linear algebra.
>
>
> I would not worry too much about this: at worst, this gives us back the
> situation where we were w/ so-called superpack, which have been successful
> in the past to spread numpy use on windows.
>
> My main worry is whether this locks us into ATLAS  for a long time because
> of package depending on numpy blas/lapack (scipy, scikit learn). I am not
> sure how much this is the case.

You mean the situation where other packages try to find the BLAS /
LAPACK library and link against that?   My impression was that neither
scipy or scikit-learn do that at the moment, but I'm happy to be
corrected.

You'd know better than me about this, but my understanding is that
BLAS / LAPACK has a standard interface that should allow code to run
the same way, regardless of which BLAS / LAPACK library it is linking
to.  So, even if another package is trying to link against the numpy
BLAS, swapping the numpy BLAS library shouldn't cause a problem
(unless the package is trying to link to ATLAS-specific stuff, which
seems a bit unlikely).
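
To make the "standard interface" point concrete: the classic BLAS dot
product `ddot` takes the same (n, x, incx, y, incy) arguments in every
conforming implementation, so a caller is indifferent to which library
provides it. A pure-Python sketch of those reference semantics (for
illustration only; this is obviously not how numpy actually calls BLAS):

```python
def ddot(n, x, incx, y, incy):
    # Reference semantics of BLAS ddot: sum over i of x[i*incx] * y[i*incy].
    # ATLAS, OpenBLAS and MKL all implement exactly this contract in
    # optimized code, which is what makes them interchangeable behind it.
    return sum(x[i * incx] * y[i * incy] for i in range(n))
```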

Is that right?

Cheers,

Matthew


[Numpy-discussion] Windows wheels, built, but should we deploy?

2016-03-03 Thread Matthew Brett
Hi,

Summary:

I propose that we upload Windows wheels to pypi.  The wheels are
likely to be stable and relatively easy to maintain, but will have
slower performance than other versions of numpy linked against faster
BLAS / LAPACK libraries.

Background:

There's a long discussion going on at issue github #5479 [1], where
the old problem of Windows wheels for numpy came up.

For those of you not following this issue, the current situation for
community-built numpy Windows binaries is dire:

* We have not so far provided windows wheels on pypi, so `pip install
numpy` on Windows will bring you a world of pain;
* Until recently we did provide .exe "superpack" installers on
sourceforge, but these became increasingly difficult to build and we
gave up building them as of the latest (1.10.4) release.

Despite this, popularity of Windows wheels on pypi is high.   A few
weeks ago, Donald Stufft ran a query for the binary wheels most often
downloaded from pypi, for any platform [2] . The top five most
downloaded were (n_downloads, name):

6646, 
numpy-1.10.4-cp27-none-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
5445, cryptography-1.2.1-cp27-none-win_amd64.whl
5243, matplotlib-1.4.0-cp34-none-win32.whl
5241, scikit_learn-0.15.1-cp34-none-win32.whl
4573, pandas-0.17.1-cp27-none-win_amd64.whl

So a) the OSX numpy wheel is very popular and b) despite the fact that
we don't provide a numpy wheel for Windows, matplotlib, scikit_learn
and pandas, that depend on numpy, are the 3rd, 4th and 5th most
downloaded wheels as of a few weeks ago.

So, there seems to be a large appetite for numpy wheels.

Current proposal:

I have now built numpy wheels, using the ATLAS blas / lapack library -
the build is automatic and reproducible [3].

I chose ATLAS to build against, rather than, say OpenBLAS, because
we've had some significant worries in the past about the reliability
of OpenBLAS, and I thought it better to err on the side of
correctness.

However, these builds are relatively slow for matrix multiply and
other linear algebra routines compared to numpy built against OpenBLAS or
MKL (which we cannot use because of its license) [4].   In my very
crude array test of a dot product and matrix inversion, the ATLAS
wheels were 2-3 times slower than MKL.  Other benchmarks on Julia
found about the same result for ATLAS vs OpenBLAS on 32-bit, but a
much bigger difference on 64-bit (for an earlier version of ATLAS than
we are currently using) [5].

So, our numpy wheels are likely to be stable and give correct results, but
will be somewhat slow for linear algebra.
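
For the record, the "very crude array test" was of roughly this shape; the
matrix sizes and repeat counts below are arbitrary stand-ins, not the ones
behind the 2-3 times figure quoted above:

```python
import time
import numpy as np

def _best_time(func, repeats):
    # Best-of-N wall-clock timing for one callable.
    best = float("inf")
    for _ in range(repeats):
        start = time.time()
        func()
        best = min(best, time.time() - start)
    return best

def bench(n=500, repeats=3):
    # Time an n x n matrix product and inversion with whatever BLAS /
    # LAPACK this numpy is linked against; returns seconds per operation.
    rng = np.random.RandomState(0)
    a, b = rng.rand(n, n), rng.rand(n, n)
    return {"dot": _best_time(lambda: a.dot(b), repeats),
            "inv": _best_time(lambda: np.linalg.inv(a), repeats)}

print(bench())
```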

I propose that we upload these ATLAS wheels to pypi.  The upside is
that this gives our Windows users a much better experience with pip,
and allows other developers to build Windows wheels that depend on
numpy.  The downside is that these will not be optimized for
performance on modern processors.  In order to signal that, I propose
adding the following text to the numpy pypi front page:

```
All numpy wheels distributed from pypi are BSD licensed.

Windows wheels are linked against the ATLAS BLAS / LAPACK library,
restricted to SSE2 instructions, so may not give optimal linear
algebra performance for your machine. See
http://docs.scipy.org/doc/numpy/user/install.html for alternatives.
```

In a way this is very similar to our previous situation, in that the
superpack installers also used ATLAS - in fact an older version of
ATLAS.

Once we are up and running with numpy wheels, we can consider whether
we should switch to other BLAS libraries, such as OpenBLAS or BLIS
(see [6]).

I'm posting here hoping for your feedback...

Cheers,

Matthew


[1] https://github.com/numpy/numpy/issues/5479
[2] https://gist.github.com/dstufft/1dda9a9f87ee7121e0ee
[3] https://ci.appveyor.com/project/matthew-brett/np-wheel-builder
[4] http://mingwpy.github.io/blas_lapack.html#intel-math-kernel-library
[5] https://github.com/numpy/numpy/issues/5479#issuecomment-185033668
[6] https://github.com/numpy/numpy/issues/7372


Re: [Numpy-discussion] Fwd: Multi-distribution Linux wheels - please test

2016-02-17 Thread Matthew Brett
Hi,

On Tue, Feb 9, 2016 at 12:01 PM, Freddy Rietdijk
 wrote:
> On Nix we also had trouble with OpenBLAS 0.2.15. Version 0.2.14 did not
> cause any segmentation faults so we reverted to that version.
> https://github.com/scipy/scipy/issues/5620
>
> (hopefully this time the e-mail gets through)

In hope, I tried building with 0.2.12 and 0.2.14, but 0.2.12 gave me
an extra test failure, and 0.2.14 gave the same test failure as
0.2.15.

Sadly...

Matthew


Re: [Numpy-discussion] Fwd: Windows wheels for testing

2016-02-16 Thread Matthew Brett
On Sat, Feb 13, 2016 at 9:55 AM, Jonathan Helmus <jjhel...@gmail.com> wrote:
>
>
> On 2/12/16 10:23 PM, Matthew Brett wrote:
>>
>> On Fri, Feb 12, 2016 at 8:18 PM, R Schumacher <r...@blue-cove.com> wrote:
>>>
>>> At 03:45 PM 2/12/2016, you wrote:
>>>>
>>>> PS C:\tmp> c:\Python35\python -m venv np-testing
>>>> PS C:\tmp> .\np-testing\Scripts\Activate.ps1
>>>> (np-testing) PS C:\tmp> pip install -f
>>>> https://nipy.bic.berkeley.edu/scipy_installers/atlas_builds numpy nose
>>>
>>>
>>> C:\Python34\Scripts>pip install  "D:\Python
>>> distros\numpy-1.10.4-cp34-none-win_amd64.whl"
>>> Unpacking d:\python distros\numpy-1.10.4-cp34-none-win_amd64.whl
>>> Installing collected packages: numpy
>>> Successfully installed numpy
>>> Cleaning up...
>>>
>>> C:\Python34\Scripts>..\python
>>> Python 3.4.2 (v3.4.2:ab2c023a9432, Oct  6 2014, 22:16:31) [MSC v.1600 64
>>> bit
>>> (AMD64)] on win32
>>> Type "help", "copyright", "credits" or "license" for more information.
>>>>>>
>>>>>> import numpy
>>>>>> numpy.test()
>>>
>>> Running unit tests for numpy
>>> NumPy version 1.10.4
>>> NumPy relaxed strides checking option: False
>>> NumPy is installed in C:\Python34\lib\site-packages\numpy
>>> Python version 3.4.2 (v3.4.2:ab2c023a9432, Oct  6 2014, 22:16:31) [MSC
>>> v.1600 64 bit (AMD64)]
>>> nose version 1.3.7
>>>
>>> ...FS...
>>>
>>> .S..
>>>
>>> ..C:\Python34\lib\unittest\case.
>>> py:162: DeprecationWarning: using a non-integer number instead of an
>>> integer
>>> will result in an error in the future
>>>callable_obj(*args, **kwargs)
>>> C:\Python34\lib\unittest\case.py:162: DeprecationWarning: using a
>>> non-integer number instead of an integer will
>>> result in an error in the future
>>>callable_obj(*args, **kwargs)
>>> C:\Python34\lib\unittest\case.py:162: DeprecationWarning: using a
>>> non-integer number instead of an integer will result i
>>> n an error in the future
>>>callable_obj(*args, **kwargs)
>>>
>>> ...S
>>>
>>> 
>>>
>>> ..C:\Python34\lib\unittest\case.py:162:
>>> Deprecat
>>> ionWarning: using a non-integer number instead of an integer will result
>>> in
>>> an error in the future
>>>callable_obj(*args, **kwargs)
>>> ..C:\Python34\lib\unittest\case.py:162: DeprecationWarning: using a
>>>

Re: [Numpy-discussion] Fwd: Windows wheels for testing

2016-02-12 Thread Matthew Brett
On Fri, Feb 12, 2016 at 8:18 PM, R Schumacher  wrote:
> At 03:45 PM 2/12/2016, you wrote:
>>
>> PS C:\tmp> c:\Python35\python -m venv np-testing
>> PS C:\tmp> .\np-testing\Scripts\Activate.ps1
>> (np-testing) PS C:\tmp> pip install -f
>> https://nipy.bic.berkeley.edu/scipy_installers/atlas_builds numpy nose
>
>
> C:\Python34\Scripts>pip install  "D:\Python
> distros\numpy-1.10.4-cp34-none-win_amd64.whl"
> Unpacking d:\python distros\numpy-1.10.4-cp34-none-win_amd64.whl
> Installing collected packages: numpy
> Successfully installed numpy
> Cleaning up...
>
> C:\Python34\Scripts>..\python
> Python 3.4.2 (v3.4.2:ab2c023a9432, Oct  6 2014, 22:16:31) [MSC v.1600 64 bit
> (AMD64)] on win32
> Type "help", "copyright", "credits" or "license" for more information.
 import numpy
 numpy.test()
> Running unit tests for numpy
> NumPy version 1.10.4
> NumPy relaxed strides checking option: False
> NumPy is installed in C:\Python34\lib\site-packages\numpy
> Python version 3.4.2 (v3.4.2:ab2c023a9432, Oct  6 2014, 22:16:31) [MSC
> v.1600 64 bit (AMD64)]
> nose version 1.3.7
> ...FS...
> .S..
> ..C:\Python34\lib\unittest\case.
> py:162: DeprecationWarning: using a non-integer number instead of an integer
> will result in an error in the future
>   callable_obj(*args, **kwargs)
> C:\Python34\lib\unittest\case.py:162: DeprecationWarning: using a
> non-integer number instead of an integer will
> result in an error in the future
>   callable_obj(*args, **kwargs)
> C:\Python34\lib\unittest\case.py:162: DeprecationWarning: using a
> non-integer number instead of an integer will result i
> n an error in the future
>   callable_obj(*args, **kwargs)
> ...S
> 
> ..C:\Python34\lib\unittest\case.py:162:
> Deprecat
> ionWarning: using a non-integer number instead of an integer will result in
> an error in the future
>   callable_obj(*args, **kwargs)
> ..C:\Python34\lib\unittest\case.py:162: DeprecationWarning: using a
> non-integer number instead of an integer will result
>  in an error in the future
>   callable_obj(*args, **kwargs)
> C:\Python34\lib\unittest\case.py:162: DeprecationWarning: using a
> non-integer number instead of an integer will result i
> n an error in the future
>   callable_obj(*args, **kwargs)
> C:\Python34\lib\unittest\case.py:162: DeprecationWarning: using a
> non-integer number instead of an integer will result i
> n an error in the future
>   callable_obj(*args, **kwargs)
> C:\Python34\lib\unittest\case.py:162: DeprecationWarning: using a
> non-integer number instead of an integer will result i
> n an error in the future
>   callable_obj(*args, **kwargs)
> [... long run of passing-test progress dots snipped]
> ...K.C:\Python34\lib
> \site-packages\numpy\ma\core.py:989: RuntimeWarning: invalid value
> encountered in multiply
>   masked_da = umath.multiply(m, da)
> C:\Python34\lib\site-packages\numpy\ma\core.py:989: RuntimeWarning: invalid
> value encountered in multiply
>   masked_da = 

[Numpy-discussion] Windows wheels for testing

2016-02-12 Thread Matthew Brett
Hi,

We're talking about putting up Windows wheels for numpy on pypi -
here: https://github.com/numpy/numpy/issues/5479

I've built some wheels that might be suitable - available here:
http://nipy.bic.berkeley.edu/scipy_installers/atlas_builds/

I'd be very grateful if y'all would test these.  They should install
with something like:

pip install -f https://nipy.bic.berkeley.edu/scipy_installers/atlas_builds numpy

This should work for Pythons 2.7, 3.4, 3.5, both 32 and 64-bit.

Any feedback would be very useful,

Cheers,

Matthew


[Numpy-discussion] Fwd: Windows wheels for testing

2016-02-12 Thread Matthew Brett
On Fri, Feb 12, 2016 at 3:15 PM, R Schumacher  wrote:
> At 03:06 PM 2/12/2016, you wrote:
>
>> Any feedback would be very useful,
>
>
> Sure, here's a little:
>
> C:\Python34\Scripts>pip install -f
> https://nipy.bic.berkeley.edu/scipy_installers/atlas_builds numpy
> Requirement already satisfied (use --upgrade to upgrade): numpy in
> c:\python34\lib\site-packages
> Cleaning up...

Thanks - yes - I should maybe have said that testing in a virtual
environment is your best bet.

Here's me testing from Powershell on a 32-bit Windows and on Pythons
3.5 and 2.7:

PS C:\tmp> c:\Python35\python -m venv np-testing
PS C:\tmp> .\np-testing\Scripts\Activate.ps1
(np-testing) PS C:\tmp> pip install -f
https://nipy.bic.berkeley.edu/scipy_installers/atlas_builds numpy nose
Collecting numpy
  Downloading 
https://nipy.bic.berkeley.edu/scipy_installers/atlas_builds/numpy-1.10.4-cp35-none-win32.whl
(6.6MB)
100% || 6.6MB 34kB/s
Collecting nose
  Using cached nose-1.3.7-py3-none-any.whl
Installing collected packages: numpy, nose
Successfully installed nose-1.3.7 numpy-1.10.4
You are using pip version 7.1.2, however version 8.0.2 is available.
You should consider upgrading via the 'python -m pip install --upgrade
pip' command.
(np-testing) PS C:\tmp> python -c 'import numpy; numpy.test()'
Running unit tests for numpy
NumPy version 1.10.4
NumPy relaxed strides checking option: False
NumPy is installed in C:\tmp\np-testing\lib\site-packages\numpy
Python version 3.5.0 (v3.5.0:374f501f4567, Sep 13 2015, 02:16:59) [MSC
v.1900 32 bit (Intel)]
nose version 1.3.7
. [test output]

Now on 2.7

(np-testing) PS C:\tmp> deactivate
PS C:\tmp> C:\Python27\Scripts\pip.exe install virtualenv
[...] output snipped
PS C:\tmp> C:\Python27\Scripts\virtualenv.exe np27-testing
Ignoring indexes: https://pypi.python.org/simple
Collecting pip
Collecting wheel
Collecting setuptools
Installing collected packages: pip, wheel, setuptools
Successfully installed pip-7.1.2 setuptools-18.5 wheel-0.26.0
PS C:\tmp> .\np27-testing\Scripts\activate.ps1
(np27-testing) PS C:\tmp> pip install -f
https://nipy.bic.berkeley.edu/scipy_installers/atlas_builds numpy nose
Collecting numpy
c:\tmp\np27-testing\lib\site-packages\pip\_vendor\requests\packages\urllib3\util\ssl_.py:90:
InsecurePlatformWarning: A
true SSLContext object is not available. This prevents urllib3 from
configuring SSL appropriately and may cause certain
SSL connections to fail. For more information, see
https://urllib3.readthedocs.org/en/latest/security.html#insecureplatf
ormwarning.
  InsecurePlatformWarning
  Downloading 
https://nipy.bic.berkeley.edu/scipy_installers/atlas_builds/numpy-1.10.4-cp27-none-win32.whl
(6.4MB)
100% || 6.4MB 35kB/s
Collecting nose
c:\tmp\np27-testing\lib\site-packages\pip\_vendor\requests\packages\urllib3\util\ssl_.py:90:
InsecurePlatformWarning: A
true SSLContext object is not available. This prevents urllib3 from
configuring SSL appropriately and may cause certain
SSL connections to fail. For more information, see
https://urllib3.readthedocs.org/en/latest/security.html#insecureplatf
ormwarning.
  InsecurePlatformWarning
  Using cached nose-1.3.7-py2-none-any.whl
Installing collected packages: numpy, nose
Successfully installed nose-1.3.7 numpy-1.10.4
You are using pip version 7.1.2, however version 8.0.2 is available.
You should consider upgrading via the 'python -m pip install --upgrade
pip' command.
(np27-testing) PS C:\tmp> python -c 'import numpy; numpy.test()'
Running unit tests for numpy
NumPy version 1.10.4
NumPy relaxed strides checking option: False
NumPy is installed in c:\tmp\np27-testing\lib\site-packages\numpy
Python version 2.7.8 (default, Jun 30 2014, 16:03:49) [MSC v.1500 32
bit (Intel)]
nose version 1.3.7
.. [etc]

Thanks a lot for testing,

Matthew


[Numpy-discussion] Hook in __init__.py to let distributors patch numpy

2016-02-11 Thread Matthew Brett
Hi,

Over at https://github.com/numpy/numpy/issues/5479 we're discussing
Windows wheels.

On thing that we would like to be able to ship Windows wheels, is to
be able to put some custom checks into numpy when you build the
wheels.

Specifically, for Windows, we're building on top of ATLAS BLAS /
LAPACK, and we need to check that the system on which the wheel is
running, has SSE2 instructions, otherwise we know ATLAS will crash
(almost everybody does have SSE2 these days).

The way I propose we do that, is this patch here:

https://github.com/numpy/numpy/pull/7231

diff --git a/numpy/__init__.py b/numpy/__init__.py
index 0fcd509..ba3ba16 100644
--- a/numpy/__init__.py
+++ b/numpy/__init__.py
@@ -190,6 +190,12 @@ def pkgload(*packages, **options):
 test = testing.nosetester._numpy_tester().test
 bench = testing.nosetester._numpy_tester().bench

+# Allow platform-specific build to intervene in numpy init
+try:
+    from . import _distributor_init
+except ImportError:
+    pass
+
 from . import core
 from .core import *
 from . import compat

So, numpy __init__.py looks for a module `_distributor_init`, in which
the distributor might have put custom code to do any checks and
initialization needed for the particular platform.  We don't by
default ship a `_distributor_init.py` but leave it up to packagers to
generate this when building binaries.

Does that sound like a sensible approach to y'all?

Cheers,

Matthew


Re: [Numpy-discussion] Fwd: Multi-distribution Linux wheels - please test

2016-02-09 Thread Matthew Brett
On Tue, Feb 9, 2016 at 11:37 AM, Julian Taylor
<jtaylor.deb...@googlemail.com> wrote:
> On 09.02.2016 04:59, Nathaniel Smith wrote:
>> On Mon, Feb 8, 2016 at 6:07 PM, Nathaniel Smith <n...@pobox.com> wrote:
>>> On Mon, Feb 8, 2016 at 6:04 PM, Matthew Brett <matthew.br...@gmail.com> 
>>> wrote:
>>>> On Mon, Feb 8, 2016 at 5:26 PM, Nathaniel Smith <n...@pobox.com> wrote:
>>>>> On Mon, Feb 8, 2016 at 4:37 PM, Matthew Brett <matthew.br...@gmail.com> 
>>>>> wrote:
>>>>> [...]
>>>>>> I can't replicate the segfault with manylinux wheels and scipy.  On
>>>>>> the other hand, I get a new test error for numpy from manylinux, scipy
>>>>>> from manylinux, like this:
>>>>>>
>>>>>> $ python -c 'import scipy.linalg; scipy.linalg.test()'
>>>>>>
>>>>>> ==
>>>>>> FAIL: test_decomp.test_eigh('general ', 6, 'F', True, False, False, (2, 
>>>>>> 4))
>>>>>> --
>>>>>> Traceback (most recent call last):
>>>>>>   File "/usr/local/lib/python2.7/dist-packages/nose/case.py", line
>>>>>> 197, in runTest
>>>>>> self.test(*self.arg)
>>>>>>   File 
>>>>>> "/usr/local/lib/python2.7/dist-packages/scipy/linalg/tests/test_decomp.py",
>>>>>> line 658, in eigenhproblem_general
>>>>>> assert_array_almost_equal(diag2_, ones(diag2_.shape[0]), 
>>>>>> DIGITS[dtype])
>>>>>>   File "/usr/local/lib/python2.7/dist-packages/numpy/testing/utils.py",
>>>>>> line 892, in assert_array_almost_equal
>>>>>> precision=decimal)
>>>>>>   File "/usr/local/lib/python2.7/dist-packages/numpy/testing/utils.py",
>>>>>> line 713, in assert_array_compare
>>>>>> raise AssertionError(msg)
>>>>>> AssertionError:
>>>>>> Arrays are not almost equal to 4 decimals
>>>>>>
>>>>>> (mismatch 100.0%)
>>>>>>  x: array([ 0.,  0.,  0.], dtype=float32)
>>>>>>  y: array([ 1.,  1.,  1.])
>>>>>>
>>>>>> --
>>>>>> Ran 1507 tests in 14.928s
>>>>>>
>>>>>> FAILED (KNOWNFAIL=4, SKIP=1, failures=1)
>>>>>>
>>>>>> This is a very odd error, which we don't get when running over a numpy
>>>>>> installed from source, linked to ATLAS, and doesn't happen when
>>>>>> running the tests via:
>>>>>>
>>>>>> nosetests /usr/local/lib/python2.7/dist-packages/scipy/linalg
>>>>>>
>>>>>> So, something about the copy of numpy (linked to openblas) is
>>>>>> affecting the results of scipy (also linked to openblas), and only
>>>>>> with a particular environment / test order.
>>>>>>
>>>>>> If you'd like to try and see whether y'all can do a better job of
>>>>>> debugging than me:
>>>>>>
>>>>>> # Run this script inside a docker container started with this 
>>>>>> incantation:
>>>>>> # docker run -ti --rm ubuntu:12.04 /bin/bash
>>>>>> apt-get update
>>>>>> apt-get install -y python curl
>>>>>> apt-get install libpython2.7  # this won't be necessary with next
>>>>>> iteration of manylinux wheel builds
>>>>>> curl -LO https://bootstrap.pypa.io/get-pip.py
>>>>>> python get-pip.py
>>>>>> pip install -f https://nipy.bic.berkeley.edu/manylinux numpy scipy nose
>>>>>> python -c 'import scipy.linalg; scipy.linalg.test()'
>>>>>
>>>>> I just tried this and on my laptop it completed without error.
>>>>>
>>>>> Best guess is that we're dealing with some memory corruption bug
>>>>> inside openblas, so it's getting perturbed by things like exactly what
>>>>> other calls to openblas have happened (which is different depending on
>>>>> whether numpy is linked to openblas), and which core type openblas has
>>>>> detected.
>>>>>
>>>>> On my laptop

Re: [Numpy-discussion] Linking other libm-Implementation

2016-02-09 Thread Matthew Brett
On Tue, Feb 9, 2016 at 7:06 AM, Daπid  wrote:
> On 8 February 2016 at 18:36, Nathaniel Smith  wrote:
>> I would be highly suspicious that this speed comes at the expense of
>> accuracy... My impression is that there's a lot of room to make
>> speed/accuracy tradeoffs in these functions, and modern glibc's libm has
>> seen a fair amount of scrutiny by people who have access to the same code
>> that openlibm is based off of. But then again, maybe not :-).
>
>
> I did some digging, and I found this:
>
> http://julia-programming-language.2336112.n4.nabble.com/Is-the-accuracy-of-Julia-s-elementary-functions-exp-sin-known-td32736.html
>
> In short: according to their devs, most openlibm functions are
> accurate to less than 1ulp, while GNU libm is rounded to closest
> float.

So GNU libm has max error <= 0.5 ULP, openlibm has <= 1 ULP, and OSX
is (almost always) somewhere in-between.

So, is <= 1 ULP good enough?
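To make the "max error in ULPs" comparison concrete, here is a small sketch (not from the thread) that measures the error of a computed double in units of the last place of the exact value:

```python
import math
import struct

def ulp(x):
    """Unit in the last place of a positive finite double x:
    the gap between x and the next representable double."""
    bits = struct.unpack('<q', struct.pack('<d', x))[0]
    nxt = struct.unpack('<d', struct.pack('<q', bits + 1))[0]
    return nxt - x

def ulp_error(computed, exact):
    """|computed - exact| measured in ULPs of the exact value."""
    return abs(computed - exact) / ulp(abs(exact))

# Example: exp(1) from a truncated Taylor series vs. math.e.
approx = sum(1.0 / math.factorial(k) for k in range(20))
print(ulp_error(approx, math.e))
```

A correctly rounded libm (max error <= 0.5 ULP) always returns the nearest double; a <= 1 ULP guarantee may return either of the two neighbors of the true result.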

Cheers,

Matthew


Re: [Numpy-discussion] Fwd: Multi-distribution Linux wheels - please test

2016-02-09 Thread Matthew Brett
On Mon, Feb 8, 2016 at 7:59 PM, Nathaniel Smith <n...@pobox.com> wrote:
> On Mon, Feb 8, 2016 at 6:07 PM, Nathaniel Smith <n...@pobox.com> wrote:
>> On Mon, Feb 8, 2016 at 6:04 PM, Matthew Brett <matthew.br...@gmail.com> 
>> wrote:
>>> On Mon, Feb 8, 2016 at 5:26 PM, Nathaniel Smith <n...@pobox.com> wrote:
>>>> On Mon, Feb 8, 2016 at 4:37 PM, Matthew Brett <matthew.br...@gmail.com> 
>>>> wrote:
>>>> [...]
>>>>> I can't replicate the segfault with manylinux wheels and scipy.  On
>>>>> the other hand, I get a new test error for numpy from manylinux, scipy
>>>>> from manylinux, like this:
>>>>>
>>>>> $ python -c 'import scipy.linalg; scipy.linalg.test()'
>>>>>
>>>>> ==
>>>>> FAIL: test_decomp.test_eigh('general ', 6, 'F', True, False, False, (2, 
>>>>> 4))
>>>>> --
>>>>> Traceback (most recent call last):
>>>>>   File "/usr/local/lib/python2.7/dist-packages/nose/case.py", line
>>>>> 197, in runTest
>>>>> self.test(*self.arg)
>>>>>   File 
>>>>> "/usr/local/lib/python2.7/dist-packages/scipy/linalg/tests/test_decomp.py",
>>>>> line 658, in eigenhproblem_general
>>>>> assert_array_almost_equal(diag2_, ones(diag2_.shape[0]), 
>>>>> DIGITS[dtype])
>>>>>   File "/usr/local/lib/python2.7/dist-packages/numpy/testing/utils.py",
>>>>> line 892, in assert_array_almost_equal
>>>>> precision=decimal)
>>>>>   File "/usr/local/lib/python2.7/dist-packages/numpy/testing/utils.py",
>>>>> line 713, in assert_array_compare
>>>>> raise AssertionError(msg)
>>>>> AssertionError:
>>>>> Arrays are not almost equal to 4 decimals
>>>>>
>>>>> (mismatch 100.0%)
>>>>>  x: array([ 0.,  0.,  0.], dtype=float32)
>>>>>  y: array([ 1.,  1.,  1.])
>>>>>
>>>>> --
>>>>> Ran 1507 tests in 14.928s
>>>>>
>>>>> FAILED (KNOWNFAIL=4, SKIP=1, failures=1)
>>>>>
>>>>> This is a very odd error, which we don't get when running over a numpy
>>>>> installed from source, linked to ATLAS, and doesn't happen when
>>>>> running the tests via:
>>>>>
>>>>> nosetests /usr/local/lib/python2.7/dist-packages/scipy/linalg
>>>>>
>>>>> So, something about the copy of numpy (linked to openblas) is
>>>>> affecting the results of scipy (also linked to openblas), and only
>>>>> with a particular environment / test order.
>>>>>
>>>>> If you'd like to try and see whether y'all can do a better job of
>>>>> debugging than me:
>>>>>
>>>>> # Run this script inside a docker container started with this incantation:
>>>>> # docker run -ti --rm ubuntu:12.04 /bin/bash
>>>>> apt-get update
>>>>> apt-get install -y python curl
>>>>> apt-get install libpython2.7  # this won't be necessary with next
>>>>> iteration of manylinux wheel builds
>>>>> curl -LO https://bootstrap.pypa.io/get-pip.py
>>>>> python get-pip.py
>>>>> pip install -f https://nipy.bic.berkeley.edu/manylinux numpy scipy nose
>>>>> python -c 'import scipy.linalg; scipy.linalg.test()'
>>>>
>>>> I just tried this and on my laptop it completed without error.
>>>>
>>>> Best guess is that we're dealing with some memory corruption bug
>>>> inside openblas, so it's getting perturbed by things like exactly what
>>>> other calls to openblas have happened (which is different depending on
>>>> whether numpy is linked to openblas), and which core type openblas has
>>>> detected.
>>>>
>>>> On my laptop, which *doesn't* show the problem, running with
>>>> OPENBLAS_VERBOSE=2 says "Core: Haswell".
>>>>
>>>> Guess the next step is checking what core type the failing machines
>>>> use, and running valgrind... anyone have a good valgrind suppressions
>>>> file?
>>>
>>> My machine (which does give the failure) gives
>>>
>>> Core: Core2
>>>
>>> with OPENBLAS_VERBOSE=2
>>
>> Yep, that allows me to reproduce it:
>>
>> root@f7153f0cc841:/# OPENBLAS_VERBOSE=2 OPENBLAS_CORETYPE=Core2 python
>> -c 'import scipy.linalg; scipy.linalg.test()'
>> Core: Core2
>> [...]
>> ==
>> FAIL: test_decomp.test_eigh('general ', 6, 'F', True, False, False, (2, 4))
>> --
>> [...]
>>
>> So this is indeed sounding like an OpenBLAS issue... next stop
>> valgrind, I guess :-/
>
> Here's the valgrind output:
>   https://gist.github.com/njsmith/577d028e79f0a80d2797
>
> There's a lot of it, but no smoking guns have jumped out at me :-/

Could you send me instructions on replicating the valgrind run, I'll
run on on the actual Core2 machine...

Matthew


Re: [Numpy-discussion] Fwd: Multi-distribution Linux wheels - please test

2016-02-08 Thread Matthew Brett
On Mon, Feb 8, 2016 at 5:26 PM, Nathaniel Smith <n...@pobox.com> wrote:
> On Mon, Feb 8, 2016 at 4:37 PM, Matthew Brett <matthew.br...@gmail.com> wrote:
> [...]
>> I can't replicate the segfault with manylinux wheels and scipy.  On
>> the other hand, I get a new test error for numpy from manylinux, scipy
>> from manylinux, like this:
>>
>> $ python -c 'import scipy.linalg; scipy.linalg.test()'
>>
>> ==
>> FAIL: test_decomp.test_eigh('general ', 6, 'F', True, False, False, (2, 4))
>> --
>> Traceback (most recent call last):
>>   File "/usr/local/lib/python2.7/dist-packages/nose/case.py", line
>> 197, in runTest
>> self.test(*self.arg)
>>   File 
>> "/usr/local/lib/python2.7/dist-packages/scipy/linalg/tests/test_decomp.py",
>> line 658, in eigenhproblem_general
>> assert_array_almost_equal(diag2_, ones(diag2_.shape[0]), DIGITS[dtype])
>>   File "/usr/local/lib/python2.7/dist-packages/numpy/testing/utils.py",
>> line 892, in assert_array_almost_equal
>> precision=decimal)
>>   File "/usr/local/lib/python2.7/dist-packages/numpy/testing/utils.py",
>> line 713, in assert_array_compare
>> raise AssertionError(msg)
>> AssertionError:
>> Arrays are not almost equal to 4 decimals
>>
>> (mismatch 100.0%)
>>  x: array([ 0.,  0.,  0.], dtype=float32)
>>  y: array([ 1.,  1.,  1.])
>>
>> --
>> Ran 1507 tests in 14.928s
>>
>> FAILED (KNOWNFAIL=4, SKIP=1, failures=1)
>>
>> This is a very odd error, which we don't get when running over a numpy
>> installed from source, linked to ATLAS, and doesn't happen when
>> running the tests via:
>>
>> nosetests /usr/local/lib/python2.7/dist-packages/scipy/linalg
>>
>> So, something about the copy of numpy (linked to openblas) is
>> affecting the results of scipy (also linked to openblas), and only
>> with a particular environment / test order.
>>
>> If you'd like to try and see whether y'all can do a better job of
>> debugging than me:
>>
>> # Run this script inside a docker container started with this incantation:
>> # docker run -ti --rm ubuntu:12.04 /bin/bash
>> apt-get update
>> apt-get install -y python curl
>> apt-get install libpython2.7  # this won't be necessary with next
>> iteration of manylinux wheel builds
>> curl -LO https://bootstrap.pypa.io/get-pip.py
>> python get-pip.py
>> pip install -f https://nipy.bic.berkeley.edu/manylinux numpy scipy nose
>> python -c 'import scipy.linalg; scipy.linalg.test()'
>
> I just tried this and on my laptop it completed without error.
>
> Best guess is that we're dealing with some memory corruption bug
> inside openblas, so it's getting perturbed by things like exactly what
> other calls to openblas have happened (which is different depending on
> whether numpy is linked to openblas), and which core type openblas has
> detected.
>
> On my laptop, which *doesn't* show the problem, running with
> OPENBLAS_VERBOSE=2 says "Core: Haswell".
>
> Guess the next step is checking what core type the failing machines
> use, and running valgrind... anyone have a good valgrind suppressions
> file?

My machine (which does give the failure) gives

Core: Core2

with OPENBLAS_VERBOSE=2
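The same bisection can be scripted: OPENBLAS_CORETYPE overrides OpenBLAS's runtime CPU detection, so one binary can be exercised with each dynamic-arch kernel in turn. A sketch (the helper below is not from the thread):

```python
import os
import subprocess
import sys

def run_with_coretype(code, coretype):
    """Run a Python snippet with a forced OpenBLAS kernel selection.

    OPENBLAS_CORETYPE picks the kernel regardless of the detected CPU,
    and OPENBLAS_VERBOSE=2 makes OpenBLAS print which core it selected.
    Returns the subprocess's exit code."""
    env = dict(os.environ, OPENBLAS_VERBOSE='2', OPENBLAS_CORETYPE=coretype)
    return subprocess.run([sys.executable, '-c', code], env=env).returncode

# e.g. replay the failing test across kernels:
# for core in ('Core2', 'Nehalem', 'Sandybridge', 'Haswell'):
#     rc = run_with_coretype('import scipy.linalg; scipy.linalg.test()', core)
#     print(core, '->', 'FAIL' if rc else 'ok')
```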

Matthew


Re: [Numpy-discussion] [SciPy-Dev] Multi-distribution Linux wheels - please test

2016-02-08 Thread Matthew Brett
On Mon, Feb 8, 2016 at 10:41 AM, Evgeni Burovski
 wrote:
>
>> numpy.show_config() shows the places that numpy found the libraries at
>> build time.  In the case of the manylinux wheel builds, I put openblas
>> at /usr/local , but the place the wheel should be loading openblas
>> from is /.libs.  For example, I think you'll
>> find that the numpy tests will still pass if you remove any openblas
>> installation at /usr/local .
>
> Confirmed: I do not have openblas in that location, and tests sort of pass
> (see a parallel email in this thread).
>
> By the way, is there a chance you could use a more specific location ---
> "What does your numpy.show_config() show?" is a question we often ask when
> receiving bug reports; having a marker location could save us an iteration
> when dealing with those once your wheels are common.

That's a good idea.

Matthew


Re: [Numpy-discussion] [SciPy-Dev] Multi-distribution Linux wheels - please test

2016-02-08 Thread Matthew Brett
Hi,

On Mon, Feb 8, 2016 at 3:29 AM, Daπid <davidmen...@gmail.com> wrote:
>
> On 6 February 2016 at 21:26, Matthew Brett <matthew.br...@gmail.com> wrote:
>>
>>
>> pip install -f https://nipy.bic.berkeley.edu/manylinux numpy scipy
>> python -c 'import numpy; numpy.test()'
>> python -c 'import scipy; scipy.test()'
>
>
>
> All the tests pass on my Fedora 23 with Python 2.7, but it seems to be
> linking to the system openblas:
>
> numpy.show_config()
> lapack_opt_info:
> libraries = ['openblas']
> library_dirs = ['/usr/local/lib']
> define_macros = [('HAVE_CBLAS', None)]
> language = c

numpy.show_config() shows the places that numpy found the libraries at
build time.  In the case of the manylinux wheel builds, I put openblas
at /usr/local , but the place the wheel should be loading openblas
from is /.libs.  For example, I think you'll
find that the numpy tests will still pass if you remove any openblas
installation at /usr/local .
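One way to see which BLAS a process actually loaded, independent of the build-time record in numpy.show_config(), is to scan the process's memory maps. This is a Linux-only sketch, not something from the thread:

```python
def blas_libs_in_maps(maps_text):
    """Return shared-library paths containing 'blas' found in the
    text of a Linux /proc/<pid>/maps file."""
    libs = set()
    for line in maps_text.splitlines():
        parts = line.split()
        # Mapped-file lines have six fields; the last is the path.
        if len(parts) >= 6 and 'blas' in parts[5].lower():
            libs.add(parts[5])
    return sorted(libs)

# In a live session, after "import numpy" has loaded its extensions:
#   with open('/proc/self/maps') as f:
#       print(blas_libs_in_maps(f.read()))
```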

Thanks for testing by the way,

Matthew


Re: [Numpy-discussion] Fwd: Multi-distribution Linux wheels - please test

2016-02-08 Thread Matthew Brett
On Mon, Feb 8, 2016 at 3:57 AM, Evgeni Burovski
<evgeny.burovs...@gmail.com> wrote:
> -- Forwarded message --
> From: Evgeni Burovski <evgeny.burovs...@gmail.com>
> Date: Mon, Feb 8, 2016 at 11:56 AM
> Subject: Re: [Numpy-discussion] Multi-distribution Linux wheels - please test
> To: Discussion of Numerical Python <numpy-discussion@scipy.org>
>
>
> On Sat, Feb 6, 2016 at 8:26 PM, Matthew Brett <matthew.br...@gmail.com> wrote:
>> Hi,
>>
>> As some of you may have seen, Robert McGibbon and Nathaniel have just
>> guided a PEP for multi-distribution Linux wheels past the approval
>> process over on distutils-sig:
>>
>> https://www.python.org/dev/peps/pep-0513/
>>
>> The PEP includes a docker image on which y'all can build wheels which
>> match the PEP:
>>
>> https://quay.io/repository/manylinux/manylinux
>>
>> Now we're at the stage where we need stress-testing of the built
>> wheels to find any problems we hadn't thought of.
>>
>> I've built numpy and scipy wheels here:
>>
>> https://nipy.bic.berkeley.edu/manylinux/
>>
>> So, if you have a Linux distribution handy, we would love to hear from
>> you about the results of testing these guys, maybe on the lines of:
>>
>> pip install -f https://nipy.bic.berkeley.edu/manylinux numpy scipy
>> python -c 'import numpy; numpy.test()'
>> python -c 'import scipy; scipy.test()'
>>
>> These manylinux wheels should soon be available on pypi, and soon
>> after, installable with latest pip, so we would like to fix as many
>> problems as possible before going live.
>>
>> Cheers,
>>
>> Matthew
>> ___
>> NumPy-Discussion mailing list
>> NumPy-Discussion@scipy.org
>> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>
>
>
> Hi,
>
> Bog-standard Ubuntu 12.04, fresh virtualenv:
>
> Python 2.7.3 (default, Jun 22 2015, 19:33:41)
> [GCC 4.6.3] on linux2
> Type "help", "copyright", "credits" or "license" for more information.
>>>> import numpy
>>>> numpy.__version__
> '1.10.4'
>>>> numpy.test()
> Running unit tests for numpy
> NumPy version 1.10.4
> NumPy relaxed strides checking option: False
> NumPy is installed in
> /home/br/virtualenvs/manylinux/local/lib/python2.7/site-packages/numpy
> Python version 2.7.3 (default, Jun 22 2015, 19:33:41) [GCC 4.6.3]
> nose version 1.3.7
>
> 
>
> ==
> ERROR: test_multiarray.TestNewBufferProtocol.test_relaxed_strides
> --
> Traceback (most recent call last):
>   File 
> "/home/br/virtualenvs/manylinux/local/lib/python2.7/site-packages/nose/case.py",
> line 197, in runTest
> self.test(*self.arg)
>   File 
> "/home/br/virtualenvs/manylinux/local/lib/python2.7/site-packages/numpy/core/tests/test_multiarray.py",
> line 5366, in test_relaxed_strides
> fd.write(c.data)
> TypeError: 'buffer' does not have the buffer interface
>
> --
>
>
> * Scipy tests pass with one error in TestNanFuncs, but the interpreter
> crashes immediately afterwards.
>
>
> Same machine, python 3.5: both numpy and scipy tests pass.

Ouch - great that you found these, I'll take a look,

Matthew


Re: [Numpy-discussion] Multi-distribution Linux wheels - please test

2016-02-08 Thread Matthew Brett
Hi Julian,

On Mon, Feb 8, 2016 at 8:40 AM, Julian Taylor
<jtaylor.deb...@googlemail.com> wrote:
> On 02/08/2016 05:23 PM, Daπid wrote:
>>
>> On 8 February 2016 at 16:19, Olivier Grisel <olivier.gri...@ensta.org
>> <mailto:olivier.gri...@ensta.org>> wrote:
>>
>>
>>
>> OPENBLAS_CORETYPE=Nehalem python3 -c "import numpy as np; from scipy
>> import linalg; linalg.eigh(np.random.randn(200, 200))"
>>
>> So this is an issue with the architecture detection of OpenBLAS.
>>
>>
>> I am seeing the same problem on a native Linux box, with Ivy Bridge
>> processor (i5-3317U). According to your script, both my native openblas
>> and the one in the wheel recognises my CPU as Sandybridge, but the wheel
>> produces a segmentation fault. Setting the architecture to Nehalem works.
>>
>
> more likely that is a bug in the kernel of openblas instead of its cpu
> detection.
> The cpuinfo of Olivier indicates it's at least a sandy bridge, and ivy
> bridge should be sandy bridge compatible.
> Is an up to date version of openblas used?

I used the latest release, v0.2.15:
https://github.com/matthew-brett/manylinux-builds/blob/master/build_openblas.sh#L5

Is there a later version that we should try?

Cheers,

Matthew


Re: [Numpy-discussion] Fwd: Multi-distribution Linux wheels - please test

2016-02-08 Thread Matthew Brett
On Mon, Feb 8, 2016 at 10:23 AM, Matthew Brett <matthew.br...@gmail.com> wrote:
> On Mon, Feb 8, 2016 at 3:57 AM, Evgeni Burovski
> <evgeny.burovs...@gmail.com> wrote:
>> -- Forwarded message --
>> From: Evgeni Burovski <evgeny.burovs...@gmail.com>
>> Date: Mon, Feb 8, 2016 at 11:56 AM
>> Subject: Re: [Numpy-discussion] Multi-distribution Linux wheels - please test
>> To: Discussion of Numerical Python <numpy-discussion@scipy.org>
>>
>>
>> On Sat, Feb 6, 2016 at 8:26 PM, Matthew Brett <matthew.br...@gmail.com> 
>> wrote:
>>> Hi,
>>>
>>> As some of you may have seen, Robert McGibbon and Nathaniel have just
>>> guided a PEP for multi-distribution Linux wheels past the approval
>>> process over on distutils-sig:
>>>
>>> https://www.python.org/dev/peps/pep-0513/
>>>
>>> The PEP includes a docker image on which y'all can build wheels which
>>> match the PEP:
>>>
>>> https://quay.io/repository/manylinux/manylinux
>>>
>>> Now we're at the stage where we need stress-testing of the built
>>> wheels to find any problems we hadn't thought of.
>>>
>>> I've built numpy and scipy wheels here:
>>>
>>> https://nipy.bic.berkeley.edu/manylinux/
>>>
>>> So, if you have a Linux distribution handy, we would love to hear from
>>> you about the results of testing these guys, maybe on the lines of:
>>>
>>> pip install -f https://nipy.bic.berkeley.edu/manylinux numpy scipy
>>> python -c 'import numpy; numpy.test()'
>>> python -c 'import scipy; scipy.test()'
>>>
>>> These manylinux wheels should soon be available on pypi, and soon
>>> after, installable with latest pip, so we would like to fix as many
>>> problems as possible before going live.
>>>
>>> Cheers,
>>>
>>> Matthew
>>> ___
>>> NumPy-Discussion mailing list
>>> NumPy-Discussion@scipy.org
>>> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>>
>>
>>
>> Hi,
>>
>> Bog-standard Ubuntu 12.04, fresh virtualenv:
>>
>> Python 2.7.3 (default, Jun 22 2015, 19:33:41)
>> [GCC 4.6.3] on linux2
>> Type "help", "copyright", "credits" or "license" for more information.
>>>>> import numpy
>>>>> numpy.__version__
>> '1.10.4'
>>>>> numpy.test()
>> Running unit tests for numpy
>> NumPy version 1.10.4
>> NumPy relaxed strides checking option: False
>> NumPy is installed in
>> /home/br/virtualenvs/manylinux/local/lib/python2.7/site-packages/numpy
>> Python version 2.7.3 (default, Jun 22 2015, 19:33:41) [GCC 4.6.3]
>> nose version 1.3.7
>>
>> 
>>
>> ==
>> ERROR: test_multiarray.TestNewBufferProtocol.test_relaxed_strides
>> --
>> Traceback (most recent call last):
>>   File 
>> "/home/br/virtualenvs/manylinux/local/lib/python2.7/site-packages/nose/case.py",
>> line 197, in runTest
>> self.test(*self.arg)
>>   File 
>> "/home/br/virtualenvs/manylinux/local/lib/python2.7/site-packages/numpy/core/tests/test_multiarray.py",
>> line 5366, in test_relaxed_strides
>> fd.write(c.data)
>> TypeError: 'buffer' does not have the buffer interface
>>
>> --
>>
>>
>> * Scipy tests pass with one error in TestNanFuncs, but the interpreter
>> crashes immediately afterwards.
>>
>>
>> Same machine, python 3.5: both numpy and scipy tests pass.
>
> Ouch - great that you found these, I'll take a look,

I think these are problems with numpy and Python 2.7.3 - because I got
the same "TypeError: 'buffer' does not have the buffer interface" on
numpy with OS X with Python.org python 2.7.3, installing from a wheel,
or installing from source.

I also get a scipy segfault with scipy 0.17.0 installed from an OSX
wheel, with output ending:

test_check_finite (test_basic.TestLstsq) ...
/Users/mb312/.virtualenvs/test/lib/python2.7/site-packages/scipy/linalg/basic.py:884:
RuntimeWarning: internal gelsd driver lwork query error, required
iwork dimension not returned. This is likely the result of LAPACK bug
0038, fixed in LAPACK 3.2.2 (released July 21, 2010). Falling back to
'gelss' driver.
  warnings.warn(mesg, RuntimeWarning)
ok
test_random_complex_exact

Re: [Numpy-discussion] Multi-distribution Linux wheels - please test

2016-02-07 Thread Matthew Brett
Hi Nadav,

On Sun, Feb 7, 2016 at 11:13 PM, Nathaniel Smith  wrote:
> (This is not relevant to the main topic of the thread, but FYI I think the
> recarray issues are fixed in 1.10.4.)
>
> On Feb 7, 2016 11:10 PM, "Nadav Horesh"  wrote:
>>
>> I have atlas-lapack-base installed via pacman (required by sagemath).
>> Since the numpy installation insisted on openblas on /usr/local, I got the
>> openblas source-code and installed  it on /usr/local.
>> BTW, I use 1.11b rather than 1.10.x since the 1.10 is very slow in
>> handling recarrays. For the tests I am erasing the 1.11 installation, and
>> installing the 1.10.4 wheel. I do verify that I have the right version
>> before running the tests, but I am not sure if there no unnoticed side
>> effects.
>>
>> Would it help if I put a side the openblas installation and rerun the
>> test?

Would you mind doing something like this, and posting the output?:

virtualenv test-manylinux
source test-manylinux/bin/activate
pip install -f https://nipy.bic.berkeley.edu/manylinux numpy==1.10.4 nose
python -c 'import numpy; numpy.test()'
python -c 'import numpy; print(numpy.__config__.show())'
deactivate

virtualenv test-from-source
source test-from-source/bin/activate
pip install numpy==1.10.4 nose
python -c 'import numpy; numpy.test()'
python -c 'import numpy; print(numpy.__config__.show())'
deactivate

I'm puzzled that the wheel gives a test error when the source install
does not, and my best guess was an openblas problem, but this just to
make sure we have the output from the exact same numpy version, at
least.

Thanks again,

Matthew


Re: [Numpy-discussion] Multi-distribution Linux wheels - please test

2016-02-07 Thread Matthew Brett
Hi,

On Sun, Feb 7, 2016 at 2:06 AM, Nadav Horesh  wrote:
> The reult tests of numpy 1.10.4 installed from source:
>
> OK (KNOWNFAIL=4, SKIP=6)
>
>
> I think I use openblas, as it is installed instead the normal blas/cblas.

Thanks again for the further tests.

What do you get for:

python -c 'import numpy; print(numpy.__config__.show())'

Matthew


Re: [Numpy-discussion] Multi-distribution Linux wheels - please test

2016-02-07 Thread Matthew Brett
On Sun, Feb 7, 2016 at 10:09 PM, Nadav Horesh  wrote:
> Thank you for reminding me, it is OK now:
> $ python -c 'import numpy; print(numpy.__config__.show())'
>
> lapack_opt_info:
> library_dirs = ['/usr/local/lib']
> language = c
> libraries = ['openblas']
> define_macros = [('HAVE_CBLAS', None)]
> blas_mkl_info:
>   NOT AVAILABLE
> openblas_info:
> library_dirs = ['/usr/local/lib']
> language = c
> libraries = ['openblas']
> define_macros = [('HAVE_CBLAS', None)]
> openblas_lapack_info:
> library_dirs = ['/usr/local/lib']
> language = c
> libraries = ['openblas']
> define_macros = [('HAVE_CBLAS', None)]
> blas_opt_info:
> library_dirs = ['/usr/local/lib']
> language = c
> libraries = ['openblas']
> define_macros = [('HAVE_CBLAS', None)]
> None
>
> I updated openblas to the latest version (0.2.15) and it passes the tests

Oh dear - now I'm confused.  So you installed the wheel, and tested
it, and it gave a test failure.  Then you updated openblas using
pacman, and then reran the tests against the wheel numpy, and they
passed?  That's a bit frightening - the wheel should only see its own
copy of openblas...

Thanks for persisting,

Matthew


[Numpy-discussion] Multi-distribution Linux wheels - please test

2016-02-06 Thread Matthew Brett
Hi,

As some of you may have seen, Robert McGibbon and Nathaniel have just
guided a PEP for multi-distribution Linux wheels past the approval
process over on distutils-sig:

https://www.python.org/dev/peps/pep-0513/

The PEP includes a docker image on which y'all can build wheels which
match the PEP:

https://quay.io/repository/manylinux/manylinux

Now we're at the stage where we need stress-testing of the built
wheels to find any problems we hadn't thought of.

I've built numpy and scipy wheels here:

https://nipy.bic.berkeley.edu/manylinux/

So, if you have a Linux distribution handy, we would love to hear from
you about the results of testing these guys, maybe on the lines of:

pip install -f https://nipy.bic.berkeley.edu/manylinux numpy scipy
python -c 'import numpy; numpy.test()'
python -c 'import scipy; scipy.test()'

These manylinux wheels should soon be available on pypi, and soon
after, installable with latest pip, so we would like to fix as many
problems as possible before going live.

Cheers,

Matthew


Re: [Numpy-discussion] Multi-distribution Linux wheels - please test

2016-02-06 Thread Matthew Brett
On Sat, Feb 6, 2016 at 9:28 PM, Nadav Horesh  wrote:
> Test platform: python 3.4.1 on archlinux x86_64
>
> scipy test: OK
>
> OK (KNOWNFAIL=97, SKIP=1626)
>
>
> numpy tests: Failed on long double and int128 tests, and got one error:
>
> Traceback (most recent call last):
>   File "/usr/lib/python3.5/site-packages/nose/case.py", line 198, in runTest
> self.test(*self.arg)
>   File 
> "/usr/lib/python3.5/site-packages/numpy/core/tests/test_longdouble.py", line 
> 108, in test_fromstring_missing
> np.array([1]))
>   File "/usr/lib/python3.5/site-packages/numpy/testing/utils.py", line 296, 
> in assert_equal
> return assert_array_equal(actual, desired, err_msg, verbose)
>   File "/usr/lib/python3.5/site-packages/numpy/testing/utils.py", line 787, 
> in assert_array_equal
> verbose=verbose, header='Arrays are not equal')
>   File "/usr/lib/python3.5/site-packages/numpy/testing/utils.py", line 668, 
> in assert_array_compare
> raise AssertionError(msg)
> AssertionError:
> Arrays are not equal
>
> (shapes (6,), (1,) mismatch)
>  x: array([ 1., -1.,  3.,  4.,  5.,  6.])
>  y: array([1])
>
> --
> Ran 6019 tests in 28.029s
>
> FAILED (KNOWNFAIL=13, SKIP=12, errors=1, failures=18)

Great - thanks so much for doing this.

Do you get a different error if you compile from source?

If you compile from source, do you link to OpenBLAS?

Thanks again,

Matthew


Re: [Numpy-discussion] Numpy 1.11.0b2 released

2016-01-28 Thread Matthew Brett
Hi,

On Thu, Jan 28, 2016 at 12:51 PM, Charles R Harris
 wrote:
> Hi All,
>
> I am pleased to announce the Numpy 1.11.0b2 release. The first beta
> was a damp squib due to missing files in the released sources; this
> release fixes that. The new source files may be downloaded from
> sourceforge; no binaries will be released until the mingw tool chain
> problems are sorted.
>
> Please test and report any problem.

OSX wheels build OK:
https://travis-ci.org/MacPython/numpy-wheels/builds/105521850

Y'all can test with:

pip install --pre --trusted-host wheels.scipy.org -f
http://wheels.scipy.org numpy

Cheers,

Matthew


Re: [Numpy-discussion] Should I use pip install numpy in linux?

2016-01-15 Thread Matthew Brett
Hi,

On Fri, Jan 15, 2016 at 1:56 PM, Steve Waterbury
 wrote:
> On 01/15/2016 04:08 PM, Benjamin Root wrote:
>>
>> So, again, I love conda for what it can do when it works well. I only
>> take exception to the notion that it can address *all* problems, because
>> there are some problems that it just simply isn't properly situated for.
>
>
> Actually, I would say you didn't mention any ... ;)  The issue is
> not that it "isn't properly situated for" (whatever that means)
> the problems you describe, but that -- in the case you mention,
> for example -- no one has conda-packaged those solutions yet.
>
> FWIW, our sysadmins and I use conda for django / apache / mod_wsgi
> sites and we are very happy with it.  IMO, compiling mod_wsgi in
> the conda environment and keeping it up is trivial compared to the
> awkwardnesses introduced by using pip/virtualenv in those cases.
>
> We also use conda for sites with nginx and the conda-packaged
> uwsgi, which works great and even permits the use of a separate
> env (with, if necessary, different versions of django, etc.)
> for each application.  No need to set up an entire VM for each app!
> *My* sysadmins love conda -- as soon as they saw how much better
> than pip/virtualenv it was, they have never looked back.
>
> IMO, conda is by *far* the best packaging solution the python
> community has ever seen (and I have been using python for more
> than 20 years).

Yes, I think everyone would agree that, until recently, Python
packaging was in a mess.

> I too have been stunned by some of the resistance
> to conda that one sometimes sees in the python packaging world.

You can correct me if you see evidence to the contrary, but I think
all of the argument is that it is desirable that pip should work as
well.

> I've had a systems package maintainer tell me "it solves a
> different problem [than pip]" ... hmmm ... I would say it
> solves the same problem *and more*, *better*.

I know what you mean, but I suppose the person you were talking to may
have been thinking that many of us already have Python distributions
that we are using, and for those, we want to use pip.

> I attribute
> some of the conda-ignoring to "NIH" and, to some extent,
> possibly defensiveness (I would be defensive too if I had been
> working on pip as long as they had when conda came along ;).

I must say, I don't personally recognize those reasons.  For example,
I hadn't worked on pip at all before conda came along.

Best,

Matthew
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Should I use pip install numpy in linux?

2016-01-15 Thread Matthew Brett
On Fri, Jan 15, 2016 at 2:15 PM, Steve Waterbury
<water...@pangalactic.us> wrote:
> On 01/15/2016 05:07 PM, Matthew Brett wrote:
>>>
>>> I attribute
>>> some of the conda-ignoring to "NIH" and, to some extent,
>>> possibly defensiveness (I would be defensive too if I had been
>>> working on pip as long as they had when conda came along ;).
>>
>>
>> I must say, I don't personally recognize those reasons.  For example,
>> I hadn't worked on pip at all before conda came along.
>
>
> By "working on pip", I was referring to *developers* of pip,
> not those who *use* pip for packaging things.  Are you
> contributing to the development of pip, or merely using it
> for creating packages?

Sorry - I assumed you were talking about us here on the list planning
to build Linux and Windows wheels.

It certainly doesn't seem surprising to me that the pip developers
would continue to develop pip rather than switch to conda.  Has there
been any attempt to persuade the pip developers to do this?

Best,

Matthew


Re: [Numpy-discussion] Should I use pip install numpy in linux?

2016-01-15 Thread Matthew Brett
On Fri, Jan 15, 2016 at 2:33 PM, Steve Waterbury
<water...@pangalactic.us> wrote:
> On 01/15/2016 05:19 PM, Matthew Brett wrote:
>>
>> On Fri, Jan 15, 2016 at 2:15 PM, Steve Waterbury
>> <water...@pangalactic.us> wrote:
>>>
>>> On 01/15/2016 05:07 PM, Matthew Brett wrote:
>>>>>
>>>>>
>>>>> I attribute
>>>>> some of the conda-ignoring to "NIH" and, to some extent,
>>>>> possibly defensiveness (I would be defensive too if I had been
>>>>> working on pip as long as they had when conda came along ;).
>>>>
>>>>
>>>>
>>>> I must say, I don't personally recognize those reasons.  For example,
>>>> I hadn't worked on pip at all before conda came along.
>>>
>>>
>>>
>>> By "working on pip", I was referring to *developers* of pip,
>>> not those who *use* pip for packaging things.  Are you
>>> contributing to the development of pip, or merely using it
>>> for creating packages?
>>
>>
>> Sorry - I assumed you were talking about us here on the list planning
>> to build Linux and Windows wheels.
>
>
> No, I was definitely *not* talking about those on the list
> planning to build Linux and Windows wheels when I referred to
> the folks "working on pip".  However, that said, I *completely*
> agree with Travis's remark:
>
> "The other very real downside is that these efforts to promote
> numpy as wheels further encourages people to not use the
> better solution that already exists in conda."

I think there's a distinction between 'promote numpy as wheels' and
'make numpy available as a wheel'.  I don't think you'll see much
evidence of "promotion" here - it's not really the open-source way.
I'm not quite sure what you mean about 'circling the wagons', but the
general approach of staying on course and seeing how things shake out
seems to me entirely sensible.

Cheers too,

Matthew


Re: [Numpy-discussion] Should I use pip install numpy in linux?

2016-01-14 Thread Matthew Brett
On Thu, Jan 14, 2016 at 9:14 AM, Chris Barker - NOAA Federal
 wrote:
>>> Also, you have the problem that there is one PyPi -- so where do you put
>>> your nifty wheels that depend on other binary wheels? you may need to fork
>>> every package you want to build :-(
>>
>> Is this a real problem or a theoretical one? Do you know of some
>> situation where this wheel to wheel dependency will occur that won't
>> just be solved in some other way?
>
> It's real -- at least during the whole bootstrapping period. Say I
> build a nifty hdf5 binary wheel -- I could probably just grab the name
> "libhdf5" on PyPI. So far so good. But the goal here would be to have
> netcdf and pytables and GDAL and who knows what else then link against
> that wheel. But those projects are all supported be different people,
> that all have their own distribution strategy. So where do I put
> binary wheels of each of those projects that depend on my libhdf5
> wheel? _maybe_ I would put it out there, and it would all grow
> organically, but neither the culture nor the tooling support that
> approach now, so I'm not very confident you could gather adoption.

I don't think there's a very large amount of cultural work - but some
to be sure.

We already have the following on OSX:

pip install numpy scipy matplotlib scikit-learn scikit-image pandas h5py

where all the wheels come from pypi.  So, I don't think this is really
outside our range, even if the problem is a little more difficult for
Linux.

> Even beyond the adoption period, sometimes you need to do stuff in
> more than one way -- look at the proliferation of channels on
> Anaconda.org.
>
> This is more likely to work if there is a good infrastructure for
> third parties to build and distribute the binaries -- e.g.
> Anaconda.org.

I thought that Anaconda.org allows pypi channels as well?

Matthew


Re: [Numpy-discussion] Should I use pip install numpy in linux?

2016-01-10 Thread Matthew Brett
On Sun, Jan 10, 2016 at 3:40 AM, Sandro Tosi <mo...@debian.org> wrote:
> On Sun, Jan 10, 2016 at 3:55 AM, Matthew Brett <matthew.br...@gmail.com> 
> wrote:
>> I updated the page with more on reasons to prefer Debian packages over
>> installing with pip:
>>
>> https://matthew-brett.github.io/pydagogue/installing_on_debian.html
>>
>> Is that enough to get the message across?
>
> That looks a lot better, thanks! I also kinda agree with all Nathaniel
> said on the matter

Good so.  You probably saw, I already removed use of sudo on the page as well.

Matthew


Re: [Numpy-discussion] Should I use pip install numpy in linux?

2016-01-09 Thread Matthew Brett
Hi Sandro,

On Sat, Jan 9, 2016 at 4:44 AM, Sandro Tosi  wrote:
>> I wrote a page on using pip with Debian / Ubuntu here :
>> https://matthew-brett.github.io/pydagogue/installing_on_debian.html
>
> Speaking with my numpy debian maintainer hat on, I would really
> appreciate if you dont suggest to use pip to install packages in
> Debian, or at least not as the only solution.

I'm very happy to accept alternative suggestions or PRs.

I know what you mean, but I can't yet see how to write a page that
would be good for explaining the benefits / tradeoffs of using deb
packages vs mainly or only pip packages vs a mix of the two.  Do you
have any thoughts?

Cheers,

Matthew


Re: [Numpy-discussion] Should I use pip install numpy in linux?

2016-01-09 Thread Matthew Brett
On Sat, Jan 9, 2016 at 6:57 PM, Sandro Tosi <mo...@debian.org> wrote:
> On Sat, Jan 9, 2016 at 6:08 PM, Matthew Brett <matthew.br...@gmail.com> wrote:
>> Hi Sandro,
>>
>> On Sat, Jan 9, 2016 at 4:44 AM, Sandro Tosi <mo...@debian.org> wrote:
>>>> I wrote a page on using pip with Debian / Ubuntu here :
>>>> https://matthew-brett.github.io/pydagogue/installing_on_debian.html
>>>
>>> Speaking with my numpy debian maintainer hat on, I would really
>>> appreciate if you dont suggest to use pip to install packages in
>>> Debian, or at least not as the only solution.
>>
>> I'm very happy to accept alternative suggestions or PRs.
>>
>> I know what you mean, but I can't yet see how to write a page that
>> would be good for explaining the benefits / tradeoffs of using deb
>> packages vs mainly or only pip packages vs a mix of the two.  Do you
>> have any thoughts?
>
> you can start by making extremely clear that this is not the Debian
> supported way to install python modules on a Debian system, that if a
> user uses pip to do it, it's very likely other applications or modules
> will fail, that if they have any problem with anything python related,
> they are on their own as they "broke" their system on purpose. thanks
> for considering

I updated the page with more on reasons to prefer Debian packages over
installing with pip:

https://matthew-brett.github.io/pydagogue/installing_on_debian.html

Is that enough to get the message across?

Cheers,

Matthew


Re: [Numpy-discussion] Should I use pip install numpy in linux?

2016-01-09 Thread Matthew Brett
On Sat, Jan 9, 2016 at 8:49 PM, Nathaniel Smith <n...@pobox.com> wrote:
> On Jan 9, 2016 10:09, "Matthew Brett" <matthew.br...@gmail.com> wrote:
>>
>> Hi Sandro,
>>
>> On Sat, Jan 9, 2016 at 4:44 AM, Sandro Tosi <mo...@debian.org> wrote:
>> >> I wrote a page on using pip with Debian / Ubuntu here :
>> >> https://matthew-brett.github.io/pydagogue/installing_on_debian.html
>> >
>> > Speaking with my numpy debian maintainer hat on, I would really
>> > appreciate if you dont suggest to use pip to install packages in
>> > Debian, or at least not as the only solution.
>>
>> I'm very happy to accept alternative suggestions or PRs.
>>
>> I know what you mean, but I can't yet see how to write a page that
>> would be good for explaining the benefits / tradeoffs of using deb
>> packages vs mainly or only pip packages vs a mix of the two.  Do you
>> have any thoughts?
>
> Why not replace all the "sudo pip" calls with "pip --user"? The trade offs
> between Debian-installed packages versus pip --user installed packages are
> subtle, and both are good options. Personal I'd generally recommend anyone
> actively developing python code to skip straight to pip for most things,
> since you'll eventually end up there anyway, but this is definitely
> debatable and situation dependent. On the other hand, "sudo pip"
> specifically is something I'd never recommend, and indeed has the potential
> to totally break your system.

Sure, but I don't think the page is suggesting doing ``sudo pip`` for
anything other than upgrading pip and virtualenv(wrapper) - and I
don't think that is likely to break the system.

Matthew


Re: [Numpy-discussion] Should I use pip install numpy in linux?

2016-01-08 Thread Matthew Brett
Hi,

On Fri, Jan 8, 2016 at 11:27 PM, Chris Barker  wrote:
> On Fri, Jan 8, 2016 at 1:58 PM, Robert McGibbon  wrote:
>>
>> I'm not sure if this is the right path for numpy or not,
>
>
> probably not -- AFAICT, the PyPA folks aren't interested in solving the
> problems we have in the scipy community -- we can tweak around the edges,
> but we won't get there without a commitment to really solve the issues -- and
> if pip did that, it would essentially be conda -- no one wants to
> re-implement conda.

Well - as the OP was implying, it really should not be too difficult.

We (here in Berkeley) have discussed how to do this for Linux,
including (Nathaniel mainly) what would be sensible for pypi to do, in
terms of platform labels.

Both Anaconda and Canopy build on a base default Linux system so that
the built binaries will work on many Linux systems.

At the moment, Linux wheels have the platform tag of either linux_i686
(32-bit) or linux_x86_64 - example filenames:

numpy-1.9.2-cp27-none-linux_i686.whl
numpy-1.9.2-cp27-none-linux_x86_64.whl

Obviously these platform tags are rather useless, because they don't
tell you very much about whether this wheel will work on your own
system.

If we started building Linux wheels on a base system like that of
Anaconda or Canopy we might like another platform tag that tells you
that this wheel is compatible with a wide range of systems.   So the
job of negotiating with distutils-sig is trying to find a good name
for this base system - we thought that 'manylinux' was a good one -
and then put in a pull request to pip to recognize 'manylinux' as
compatible when running pip install from a range of Linux systems.
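The wheel filenames above follow PEP 427's `name-version-pythontag-abitag-platformtag.whl` layout, so the platform tag can be read straight out of the filename. A minimal sketch (the function name is ours, and it assumes the simple five-component form used in the examples, with no optional build tag):

```python
def wheel_tags(filename):
    """Split a simple PEP 427 wheel filename into its tag components."""
    stem = filename[: -len(".whl")]
    name, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {"name": name, "version": version, "python": python_tag,
            "abi": abi_tag, "platform": platform_tag}

# The platform tag is the last component -- 'linux_x86_64' tells you
# only the architecture, not which distributions the wheel runs on,
# which is exactly why a 'manylinux'-style tag is needed.
tags = wheel_tags("numpy-1.9.2-cp27-none-linux_x86_64.whl")
print(tags["platform"])  # linux_x86_64
```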

Cheers,

Matthew


Re: [Numpy-discussion] Should I use pip install numpy in linux?

2016-01-08 Thread Matthew Brett
On Sat, Jan 9, 2016 at 12:31 AM, Robert McGibbon  wrote:
>> Both Anaconda and Canopy build on a base default Linux system so that
>> the built binaries will work on many Linux systems.
>
> I think the base linux system is CentOS 5, and from my experience, it seems
> like this approach
> has worked very well. Those packages are compatible with all essentially all
> Linuxes that are
> more recent than CentOS 5 (which is ancient). I have not heard of anyone
> complaining that the
> packages they install through conda don't work on their CentOS 4 or Ubuntu
> 6.06 box. I assume
> Python / pip is probably used on a wider diversity of linux flavors than
> conda is, so I'm sure that
> binaries built on CentOS 5 won't work for absolutely _every_ linux user, but
> it does seem to
> cover the substantial majority of linux users.
>
> Building redistributable linux binaries that work across a large number of
> distros and distro
> versions is definitely tricky. If you run ``python setup.py bdist_wheel`` on
> your Fedora Rawhide
> box, you can't really expect the wheel to work for too many other linux
> users. So given that, I
> can see why PyPI would want to be careful about accepting Linux wheels.
>
> But it seems like, if they make the upload something like
>
> ```
> twine upload numpy-1.9.2-cp27-none-linux_x86_64.whl \
> --yes-yes-i-know-this-is-dangerous-but-i-know-what-i'm-doing
> ```
>
> that this would potentially be able to let packages like numpy serve their
> linux
> users better without risking too much junk being uploaded to PyPI.

I could well understand it if the pypa folks thought that was a bad
idea.  There are so many Linux distributions and therefore so many
ways for this to go wrong, that the most likely outcome would be a
relentless flood of people saying "ouch this doesn't work for me", and
therefore decreased trust for pip / Linux in general.

On the other hand, having a base build system and matching platform
tag seems like it is well within reach, and, if we provided proof of
concept, I guess that pypa would agree.

Cheers,

Matthew


Re: [Numpy-discussion] Should I use pip install numpy in linux?

2016-01-08 Thread Matthew Brett
Hi,

On Fri, Jan 8, 2016 at 4:28 PM, Yuxiang Wang  wrote:
> Dear Nathaniel,
>
> Gotcha. That's very helpful. Thank you so much!
>
> Shawn
>
> On Thu, Jan 7, 2016 at 10:01 PM, Nathaniel Smith  wrote:
>> On Thu, Jan 7, 2016 at 6:18 PM, Yuxiang Wang  wrote:
>>> Dear all,
>>>
>>> I know that in Windows, we should use either Christoph's package or
>>> Anaconda for MKL-optimized numpy. In Linux, the fortran compiler issue
>>> is solved, so should I directly used pip install numpy to get numpy
>>> with a reasonable BLAS library?
>>
>> pip install numpy should work fine; whether it gives you a reasonable
>> BLAS library will depend on whether you have the development files for
>> a reasonable BLAS library installed, and whether numpy's build system
>> is able to automatically locate them. Generally this means that if
>> you're on a regular distribution and remember to install a decent BLAS
>> -dev or -devel package, then you'll be fine.
>>
>> On Debian/Ubuntu, 'apt install libopenblas-dev' is probably enough to
>> ensure something reasonable happens.
>>
>> Anaconda is also an option on linux if you want MKL (or openblas).

I wrote a page on using pip with Debian / Ubuntu here :
https://matthew-brett.github.io/pydagogue/installing_on_debian.html

Cheers,

Matthew


Re: [Numpy-discussion] ANN: second release candidate for scipy 0.17.0

2016-01-07 Thread Matthew Brett
Hi,

On Thu, Jan 7, 2016 at 11:24 PM, Evgeni Burovski
 wrote:
> Hi,
>
> I'm pleased to announce the availability of the second release
> candidate for Scipy 0.17.0. It's two days ahead of the original
> schedule: based on typical development patterns, I'd like to have two
> weekends and a full working week before January 17th, when this rc2 is
> supposed to become the final release.
>
> Please try this rc and report any issues on Github tracker or
> scipy-dev mailing list.
> Source tarballs and full release notes are available from Github
> Releases: https://github.com/scipy/scipy/releases/tag/v0.17.0rc2
>
> Compared to rc1, the following PRs were merged:
>
> - #5624: FIX: Fix interpolate
> - #5625: BUG: msvc9 binaries crash when indexing std::vector of size 0
> - #5635: BUG: misspelled __dealloc__ in cKDTree.
> - #5642: STY: minor fixup of formatting of 0.17.0 release notes.
> - #5643: BLD: fix a build issue in special/Faddeeva.cc with isnan.
> - #5661: TST: linalg tests used stdlib random instead of numpy.random.
>
> There is a bit of an unusual change in this rc. In short, in testing
> rc1 we found that 1) interpolate.interp1d had an undocumented feature
> of allowing an array-valued fill_value to be broadcast against the
> data being interpolated, and  2) we broke it in 0.17.0rc1. (See PR
> 5624 for details.)
> We now *think* we fixed it so that no existing code is broken.
> Nevertheless, I would like to encourage everyone to test their code
> against this rc2, especially all code which uses interp1d.

Thanks for doing this.  Testing builds of OSX wheels via
https://github.com/MacPython/scipy-wheels, I found this problem :
https://github.com/scipy/scipy/issues/5689

Cheers,

Matthew


Re: [Numpy-discussion] Windows build/distribute plan & MingwPy funding

2016-01-04 Thread Matthew Brett
Hi,

On Mon, Jan 4, 2016 at 5:35 PM, Erik Bray  wrote:
> On Sat, Jan 2, 2016 at 3:20 AM, Ralf Gommers  wrote:
>> Hi all,
>>
>> You probably know that building Numpy, Scipy and the rest of the Scipy Stack
>> on Windows is problematic. And that there are plans to adopt the static
>> MinGW-w64 based toolchain that Carl Kleffner has done a lot of work on for
>> the last two years to fix that situation.
>>
>> The good news is: this has become a lot more concrete just now, with this
>> proposal for funding: http://mingwpy.github.io/proposal_december2015.html
>>
>> Funding for phases 1 and 2 is already confirmed; the phase 3 part has been
>> submitted to the PSF. Phase 1 (of $1000) is funded by donations made to
>> Numpy and Scipy (through NumFOCUS), and phase 2 (of $4000) by NumFOCUS
>> directly. So a big thank you to everyone who made a donation to Numpy, Scipy
>> and NumFOCUS!
>>
>> I hope that that proposal gives a clear idea of the work that's going to be
>> done over the next months. Note that the http://mingwpy.github.io contains a
>> lot more background info, description of technical issues, etc.
>>
>> Feedback & ideas very welcome of course!
>>
>> Cheers,
>> Ralf
>
> Hi Ralf,
>
> I've seen you drop hints about this recently, and am interested to
> follow this work.  I've been hired as part of the OpenDreamKit project
> to work, in large part, on developing a sensible toolchain for
> building and distributing Sage on Windows.  I know you've been
> following the thread on that too.  Although the primary goal there is
> "whatever works", I'm personally inclined to focus on the mingwpy /
> mingw-w64 approach, due in large part with my past success with the
> MinGW 32-bit toolchain.  (I personally have a desire to improve
> support for building with MSVC as well, but that's a less important
> goal as far as the funding is concerned.)
>
> So anyways, please keep me in the loop about this, as I will also be
> putting effort into this over the next year as well.  Has there been
> any discussion about setting up a mailing list specifically for this
> project?

Yes, it exists already, but not well advertised :
https://groups.google.com/forum/#!forum/mingwpy

It would be great to share work.

Cheers,

Matthew


Re: [Numpy-discussion] Numpy 1.10.3 release.

2016-01-02 Thread Matthew Brett
Hi,

On Sat, Jan 2, 2016 at 9:47 PM, Charles R Harris
 wrote:
> Hi All,
>
> A significant segfault problem has been reported against Numpy 1.10.2 and I
> want to make a quick 1.10.3 release to get it fixed. Two questions
>
> What exactly is the release process that has been decided on? AFAIK, I
> should just do a source release on Sourceforge, ping Matthew to produce
> wheels for Mac and wait for him to put them on pypi, and then upload the
> sources to pypi.

You're welcome to ping me to do the Mac wheels, but I'd be even
happier if you could have a go at triggering the build, to see if you
can make it work:

https://github.com/MacPython/numpy-wheels

Cheers,

Matthew


Re: [Numpy-discussion] Numpy 1.10.3 release.

2016-01-02 Thread Matthew Brett
On Sun, Jan 3, 2016 at 12:01 AM, Charles R Harris
<charlesr.har...@gmail.com> wrote:
>
>
> On Sat, Jan 2, 2016 at 4:51 PM, Matthew Brett <matthew.br...@gmail.com>
> wrote:
>>
>> Hi,
>>
>> On Sat, Jan 2, 2016 at 9:47 PM, Charles R Harris
>> <charlesr.har...@gmail.com> wrote:
>> > Hi All,
>> >
>> > A significant segfault problem has been reported against Numpy 1.10.2
>> > and I
>> > want to make a quick 1.10.3 release to get it fixed. Two questions
>> >
>> > What exactly is the release process that has been decided on? AFAIK, I
>> > should just do a source release on Sourceforge, ping Matthew to produce
>> > wheels for Mac and wait for him to put them on pypi, and then upload the
>> > sources to pypi.
>>
>> You're welcome to ping me to do the Mac wheels, but I'd be even
>> happier if you could have a go at triggering the build, to see if you
>> can make it work:
>>
>> https://github.com/MacPython/numpy-wheels
>
>
> Hmm... Since it needs to build off the tag to have the correct behavior, I
> assume that that needs to be set explicitly.

No, the build system uses a script I wrote to find the tag closest in
development history to master, by default - see :
http://stackoverflow.com/a/24557377/1939576

So, if you push a tag, then trigger a build, there's a good chance
that will do what you want.  If not, then you can set the tag or
commit explicitly at :

https://github.com/MacPython/numpy-wheels/blob/master/.travis.yml#L5
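The tag-finding behaviour described above can be approximated with plain git (a sketch only; the actual script at the Stack Overflow link is more careful about which branch it inspects):

```shell
# Create a throwaway repo with a tag followed by one more commit,
# then ask git for the nearest tag in the history of HEAD.
repo=$(mktemp -d)
cd "$repo"
git init -q .
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "first"
git tag v1.10.2
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "second"

# --abbrev=0 strips the commit-count/hash suffix, leaving the bare tag.
git describe --tags --abbrev=0   # prints: v1.10.2
```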

> Also, it didn't look as if your
> last pypi uploads were signed. Did you upload with twine?

I did yes - with a custom uploader script [1], but I guess it would be
better to download, sign, then upload again using some other mechanism
that preserves the signature.

Cheers,

Matthew

[1] https://github.com/MacPython/terryfy/blob/master/wheel-uploader

