[Numpy-discussion] migration of all scipy.org mailing lists

2017-03-23 Thread Ralf Gommers
Hi all,

The server for the scipy.org mailing list is in very bad shape, so we (led
by Didrik Pinte) are planning to complete the migration of active mailing
lists to the python.org infrastructure and to decommission the lists that
seem dormant/obsolete.

The scipy-user mailing list was already moved to python.org a month or two
ago, and that migration went smoothly.

These are the lists we plan to migrate:

astropy
ipython-dev
ipython-user
numpy-discussion
numpy-svn
scipy-dev
scipy-organizers
scipy-svn

And these we plan to retire:

Announce
APUG
Ipython-tickets
Mailman
numpy-refactor
numpy-refactor-git
numpy-tickets
Pyxg
scipy-tickets
NiPy-devel

This will happen asap, likely within a day or two. So two requests:
1.  If you see any issue with this plan, please reply and keep Didrik and
myself on Cc (we are not subscribed to all lists).
2. If you see this message on a numpy/scipy list, but not on another list
(could be due to a moderation queue) then please forward this message again
to that other list.

Thanks,
Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] migration of all scipy.org mailing lists

2017-03-23 Thread Ralf Gommers
On Thu, Mar 23, 2017 at 3:11 AM, Nathan Goldbaum 
wrote:

> Are you sure about astropy? They recently moved to google groups.


https://mail.scipy.org/pipermail/astropy/ shows messages as recent as
today. So I'm fairly sure (but happy to be corrected if moving to
python.org is not desired for whatever reason).

Ralf


Re: [Numpy-discussion] migration of all scipy.org mailing lists

2017-03-22 Thread Ralf Gommers
On Thu, Mar 23, 2017 at 12:18 AM, Neal Becker  wrote:

> Has anyone taken care of notifying gmane about this?
>

We will have to update this info in quite a few places after the move is
done. Including Gmane, although that site hasn't been working for half a
year so is pretty low on the priority list.

Ralf


Re: [Numpy-discussion] migration of all scipy.org mailing lists

2017-03-22 Thread Ralf Gommers
(and now with Didrik on Cc - apologies)



On Wed, Mar 22, 2017 at 10:23 PM, Ralf Gommers <ralf.gomm...@gmail.com>
wrote:

> Hi all,
>
> The server for the scipy.org mailing list is in very bad shape, so we
> (led by Didrik Pinte) are planning to complete the migration of active
> mailing lists to the python.org infrastructure and to decommission the
> lists that seem dormant/obsolete.
>
> The scipy-user mailing list was already moved to python.org a month or
> two ago, and that migration went smoothly.
>
> These are the lists we plan to migrate:
>
> astropy
> ipython-dev
> ipython-user
> numpy-discussion
> numpy-svn
> scipy-dev
> scipy-organizers
> scipy-svn
>
> And these we plan to retire:
>
> Announce
> APUG
> Ipython-tickets
> Mailman
> numpy-refactor
> numpy-refactor-git
> numpy-tickets
> Pyxg
> scipy-tickets
> NiPy-devel
>
> There will be a short period (<24 hours) where messages to the list may be
> refused, with an informative message as to why. The mailing list address
> will change from listn...@scipy.org to listn...@python.org
>
> This will happen asap, likely within a day or two. So two requests:
> 1.  If you see any issue with this plan, please reply and keep Didrik and
> myself on Cc (we are not subscribed to all lists).
> 2. If you see this message on a numpy/scipy list, but not on another list
> (could be due to a moderation queue) then please forward this message again
> to that other list.
>
> Thanks,
> Ralf
>
>
>
>


[Numpy-discussion] migration of all scipy.org mailing lists

2017-03-22 Thread Ralf Gommers
Hi all,

The server for the scipy.org mailing list is in very bad shape, so we (led
by Didrik Pinte) are planning to complete the migration of active mailing
lists to the python.org infrastructure and to decommission the lists that
seem dormant/obsolete.

The scipy-user mailing list was already moved to python.org a month or two
ago, and that migration went smoothly.

These are the lists we plan to migrate:

astropy
ipython-dev
ipython-user
numpy-discussion
numpy-svn
scipy-dev
scipy-organizers
scipy-svn

And these we plan to retire:

Announce
APUG
Ipython-tickets
Mailman
numpy-refactor
numpy-refactor-git
numpy-tickets
Pyxg
scipy-tickets
NiPy-devel

There will be a short period (<24 hours) where messages to the list may be
refused, with an informative message as to why. The mailing list address
will change from listn...@scipy.org to listn...@python.org

This will happen asap, likely within a day or two. So two requests:
1.  If you see any issue with this plan, please reply and keep Didrik and
myself on Cc (we are not subscribed to all lists).
2. If you see this message on a numpy/scipy list, but not on another list
(could be due to a moderation queue) then please forward this message again
to that other list.

Thanks,
Ralf


Re: [Numpy-discussion] Move scipy.org docs to Github?

2017-03-15 Thread Ralf Gommers
On Wed, Mar 15, 2017 at 3:21 PM, Matthew Brett 
wrote:

> Hi,
>
> The scipy.org site is down at the moment, and has been for more than 36
> hours:
>
> https://github.com/numpy/numpy/issues/8779#issue-213781439
>
> This has happened before:
>
> https://github.com/scipy/scipy.org/issues/187#issue-186426408
>
> I think it was down for about 24 hours that time.
>
> From the number of people opening issues or commenting on the
> scipy.org website this time, it seems to be causing quite a bit of
> disruption.
>
> It seems to me that we would have a much better chances of avoiding
> significant down-time, if we switched to hosting the docs on github
> pages.
>
> What do y'all think?
>

Once the site is back up we should look at migrating to a better (hosted)
infrastructure. I suspect that GitHub Pages won't work: we'll exceed, or be
close to exceeding, both the 1 GB site-size limit and the 100 GB/month
bandwidth limit [1].

Rough bandwidth estimate (using page size from
http://scipy.github.io/devdocs/ and Alexa stats): 2 million visits per
month, 2.5 page views per visit, 5 kB/page = 25 GB/month (HTML). Add to
that the PDF docs, which are ~20 MB in size: if even a small fraction of
visitors download those, we'll be at >100 GB.
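For what it's worth, the arithmetic above is easy to reproduce; a minimal sketch (the visit/page figures are the estimates from this message, and the PDF-download fraction is a made-up illustration, not a measurement):

```python
# Back-of-envelope GitHub Pages bandwidth check, using the figures above.
visits_per_month = 2_000_000   # Alexa-based estimate from this thread
pages_per_visit = 2.5
kb_per_page = 5                # estimated HTML page weight

html_gb = visits_per_month * pages_per_visit * kb_per_page / 1e6  # kB -> GB
print(f"HTML traffic: ~{html_gb:.0f} GB/month")   # ~25 GB/month

# The ~20 MB PDF docs dominate: even a 0.5% download rate blows the budget.
pdf_mb = 20
pdf_fraction = 0.005           # hypothetical fraction of visits
pdf_gb = visits_per_month * pdf_fraction * pdf_mb / 1e3  # MB -> GB
print(f"PDF traffic at 0.5% of visits: ~{pdf_gb:.0f} GB/month")  # ~200 GB/month
```

So the HTML alone fits, but any meaningful PDF traffic pushes well past the 100 GB/month limit.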

Ralf

[1] https://help.github.com/articles/what-is-github-pages/#usage-limits


Re: [Numpy-discussion] docs.scipy.org down

2017-03-13 Thread Ralf Gommers
On Tue, Mar 14, 2017 at 8:16 AM, Ryan May  wrote:

> Is https://docs.scipy.org/ being down known issue?
>

It is. Is being worked on; tracking issue is
https://github.com/numpy/numpy/issues/8779

Thanks for reporting,
Ralf


Re: [Numpy-discussion] Daily numpy wheel builds - prefer to per-commit builds?

2017-02-28 Thread Ralf Gommers
On Tue, Feb 28, 2017 at 9:43 PM, Matthew Brett 
wrote:

> Hi,
>
> I've been working to get daily travis-ci cron-job manylinux builds
> working for numpy and scipy wheels.   They are now working OK:
>
> https://travis-ci.org/MacPython/numpy-wheels
> https://travis-ci.org/MacPython/scipy-wheels
> https://7933911d6844c6c53a7d-47bd50c35cd79bd838daf386af554a83.ssl.cf2.rackcdn.com
>
> These are daily builds of the respective numpy and scipy master branches.
>
> Numpy already has a system, kindly worked up by Olivier Grisel, which
> uploads wheel builds from each travis-ci run. travis-ci uploads these
> wheels for every commit, at the
> "travis-dev-wheels" container on Rackspace, visible at
> https://f66d8a5767b134cb96d3-4ffdece11fd3f72855e4665bc61c7445.ssl.cf2.rackcdn.com
> . These builds are specific to the Linux they were built on - in this
> case Ubuntu 12.04.
>
> Some projects use these per-commit builds for testing compatibility
> with development Numpy code.
>
> Now I'm wondering what the relationship should be between the current
> every-commit builds and the new wheel builds.
>
> I think that we should prefer the new wheel builds and deprecate the
> previous per-commit builds, because:
>
> * the new manylinux wheels are self-contained, and so can be installed
> with pip without extra lines of `apt` installs;
> * the manylinux builds work on any travis container, not just the current
> 12.04 container;
> * manylinux builds should be faster, as they are linked against OpenBLAS;
> * manylinux wheels are closer to the wheels we distribute for
> releases, and therefore more useful for testing against.
>
> What do y'all think?
>

Uploading your daily Linux and OS X builds and just turning off uploads of
the per-commit builds sounds like an improvement to me.

Ralf


Re: [Numpy-discussion] Could we simplify backporting?

2017-02-22 Thread Ralf Gommers
On Wed, Feb 22, 2017 at 7:52 AM, Marten van Kerkwijk <
m.h.vankerkw...@gmail.com> wrote:

> Hi All,
>
> In gh-8594, a question came up how to mark things that should be
> backported and Chuck commented [1]:
>
> > Our backport policy is still somewhat ad hoc, especially as I the only
> one who has been doing release. What I currently do is set the milestone to
> the earlier version, so I will find the PR when looking for backports, then
> do a backport, label it as such, set the milestone on the backported
> version, and remove the milestone from the original. I'm not completely
> happy with the process, so if you have better ideas I'd like to hear them.
> One option I've considered is a `backported` label in addition to the
> `backport` label, then use the latter for things to be backported.
>

I really don't like the double work and the large amount of noise coming
from backporting every other PR to NumPy very quickly. For SciPy the policy
is:
  - anyone can set the "backport-candidate" label
  - the release manager backports, usually a bunch in one go
  - only important fixes get backported (involves some judging, but things
like silencing warnings, doc fixes, etc. are not important enough)

This works well, and I'd hope that we can make the NumPy approach similar.

> It seems that continuing to set the milestone to a bug-release version
> if required was a good idea; it is little effort to anyone and keeps
> things clear.


+1

> For the rest, might it be possible to make things more
> automated? E.g., might it be possible to have some travis magic that
> does a trial merge & test?


Not sure how you would deal with merge conflicts on cherry-picks in an
automatic backport thingy?
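The merge-conflict question is indeed the crux; a trial backport is scriptable right up to the point where a cherry-pick conflicts. A self-contained toy sketch of the idea (a throwaway repository driven through git, nothing NumPy-specific):

```python
# Hypothetical "trial backport" check: create a toy repo, branch off a
# maintenance line, commit a fix on master, then see whether the fix
# cherry-picks cleanly onto the maintenance branch.
import subprocess
import tempfile
from pathlib import Path

def git(*args, cwd):
    return subprocess.run(["git", *args], cwd=cwd,
                          capture_output=True, text=True)

repo = tempfile.mkdtemp()
git("init", "-q", cwd=repo)
git("config", "user.email", "ci@example.com", cwd=repo)
git("config", "user.name", "ci", cwd=repo)

Path(repo, "f.txt").write_text("base\n")
git("add", "f.txt", cwd=repo)
git("commit", "-qm", "initial", cwd=repo)
git("branch", "maintenance", cwd=repo)       # release branch point

Path(repo, "f.txt").write_text("fix\n")
git("commit", "-aqm", "fix on master", cwd=repo)
sha = git("rev-parse", "HEAD", cwd=repo).stdout.strip()

git("checkout", "-q", "maintenance", cwd=repo)
result = git("cherry-pick", sha, cwd=repo)
if result.returncode == 0:
    print("backport applies cleanly")
else:
    git("cherry-pick", "--abort", cwd=repo)
    print("backport conflicts; needs a manual backport PR")
```

A CI job along these lines could at least flag which milestone-tagged PRs will need human attention; the conflicting ones still require a manual cherry-pick.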

> Could one somehow merge to multiple
> branches at the same time?
>

Don't think so.

Ralf


Re: [Numpy-discussion] Numpy Development Queries

2017-02-20 Thread Ralf Gommers
On Tue, Feb 21, 2017 at 7:32 AM, ashwin.pathak <
ashwin.pat...@students.iiit.ac.in> wrote:

> Hello all,
> I am new to this organization and wanted to start with some easy-fix
> issues to get some knowledge about the source code. However, the issues
> under easy-fix labels have already been solved or someone is at it. Can
> someone help me find such issues?
>

Hi Ashwin, welcome. I don't want to seem discouraging, but I do want to
explain that NumPy is significantly harder to get started on than SciPy
(which you've started on already) as a newcomer to the scientific Python
ecosystem. So I'd encourage you to spend some more time on the SciPy issues
- there are more easy-fix ones there, and the process of contributing (pull
requests, reviews, finding your way around the codebase) is similar for the
two projects.

Cheers,
Ralf


Re: [Numpy-discussion] Eric Wieser added to NumPy team.

2017-02-19 Thread Ralf Gommers
On Sun, Feb 19, 2017 at 7:07 AM, Charles R Harris  wrote:

> Hi All,
>
> I'm pleased to welcome Eric to the NumPy team. There is a pile of pending
> PRs that grows every day and we are counting on Eric to help us keep it
> in check ;)
>

Welcome to the team Eric!

Ralf


Re: [Numpy-discussion] PowerPC testing servers

2017-02-16 Thread Ralf Gommers
On Thu, Feb 16, 2017 at 3:53 PM, Sandro Tosi  wrote:

> > A recent post to the wheel-builders mailing list pointed out some
> > links to places providing free PowerPC hosting for open source
> > projects, if they agree to a submitted request:
>
> The debian project has some powerpc machines (and we still build numpy
> on those boxes when i upload a new revision to our archives) and they
> also have hosts dedicated to let debian developers login and debug
> issues with their packages on that architecture. I can sponsor access
> to those machines for some of you, but it is not a place where you can
> host a CI instance.
>
> Just keep it in mind more broadly than powerpc, f.e. these are all the
> archs where numpy was built after the last upload
> https://buildd.debian.org/status/package.php?p=python-numpy&suite=unstable
> (the grayed out archs are the ones non release critical, so packages
> are built as best effort and if missing is not a big deal)


Thanks Sandro. It looks like even for the release-critical ones, it's just
the build that has to succeed and failures are not detected? For example,
armel is green but has 9 failures:
https://buildd.debian.org/status/fetch.php?pkg=python-numpy&arch=armel&ver=1%3A1.12.0-2&stamp=1484889563&raw=0

Ralf


Re: [Numpy-discussion] PowerPC testing servers

2017-02-16 Thread Ralf Gommers
On Thu, Feb 16, 2017 at 8:58 AM, Matthew Brett <matthew.br...@gmail.com>
wrote:

> On Wed, Feb 15, 2017 at 7:55 PM, Ralf Gommers <ralf.gomm...@gmail.com>
> wrote:
> >
> >
> > On Thu, Feb 16, 2017 at 8:45 AM, Matthew Brett <matthew.br...@gmail.com>
> > wrote:
> >>
> >> On Wed, Feb 15, 2017 at 7:37 PM, Ralf Gommers <ralf.gomm...@gmail.com>
> >> wrote:
> >> >
> >> >
> >> > On Thu, Feb 16, 2017 at 8:02 AM, Matthew Brett <
> matthew.br...@gmail.com>
> >> > wrote:
> >> >>
> >> >> Hey,
> >> >>
> >> >> A recent post to the wheel-builders mailing list pointed out some
> >> >> links to places providing free PowerPC hosting for open source
> >> >> projects, if they agree to a submitted request:
> >> >>
> >> >>
> >> >> https://mail.python.org/pipermail/wheel-builders/2017-February/000257.html
> >> >>
> >> >> It would be good to get some testing going on these architectures.
> >> >> Shall we apply for hosting, as the numpy organization?
> >> >
> >> >
> >> > Those are bare VMs it seems. Remembering the Buildbot and Mailman
> >> > horrors, I
> >> > think we should be very reluctant to take responsibility for
> >> > maintaining
> >> > CI on anything that's not hosted and can be controlled with a simple
> >> > config
> >> > file in our repo.
> >>
> >> Not sure what you mean about mailman - maybe the Enthought servers we
> >> didn't have access to?
> >
> >
> > We did have access (for most of the time), it's just that no one is
> > interested in putting in lots of hours on sysadmin duties.
> >
> >>
> >> For buildbot, I've been maintaining about 12
> >> crappy old machines for about 7 years now [1] - I'm happy to do the
> >> same job for a couple of properly hosted PPC machines.
> >
> >
> > That's awesome persistence. The NumPy and SciPy buildbots certainly
> weren't
> > maintained like that, half of them were offline or broken for long
> periods
> > usually.
>
> Right - they do need persistence, and to have someone who takes
> responsibility for them.
>
> >>
> >>  At least we'd
> >> have some way of testing for these machines, if we get stuck - even if
> >> that involved spinning up a VM and installing the stuff we needed from
> >> the command line.
> >
> >
> > I do see the value of testing on more platforms of course. It's just
> about
> > logistics/responsibilities. If you're saying that you'll do the
> maintenance,
> > and want to apply for resources using the NumPy name, that's much better
> I
> think than making "the numpy devs" collectively responsible.
>
> Yes, exactly.  I'm happy to take responsibility for them, I just
> wanted to make sure that numpy devs could get at them if I'm not
> around for some reason.
>

In that case, +1 from me!

Ralf


Re: [Numpy-discussion] PowerPC testing servers

2017-02-15 Thread Ralf Gommers
On Thu, Feb 16, 2017 at 8:45 AM, Matthew Brett <matthew.br...@gmail.com>
wrote:

> On Wed, Feb 15, 2017 at 7:37 PM, Ralf Gommers <ralf.gomm...@gmail.com>
> wrote:
> >
> >
> > On Thu, Feb 16, 2017 at 8:02 AM, Matthew Brett <matthew.br...@gmail.com>
> > wrote:
> >>
> >> Hey,
> >>
> >> A recent post to the wheel-builders mailing list pointed out some
> >> links to places providing free PowerPC hosting for open source
> >> projects, if they agree to a submitted request:
> >>
> >> https://mail.python.org/pipermail/wheel-builders/2017-February/000257.html
> >>
> >> It would be good to get some testing going on these architectures.
> >> Shall we apply for hosting, as the numpy organization?
> >
> >
> > Those are bare VMs it seems. Remembering the Buildbot and Mailman
> horrors, I
> think we should be very reluctant to take responsibility for
> maintaining
> > CI on anything that's not hosted and can be controlled with a simple
> config
> > file in our repo.
>
> Not sure what you mean about mailman - maybe the Enthought servers we
> didn't have access to?


We did have access (for most of the time), it's just that no one is
interested in putting in lots of hours on sysadmin duties.


> For buildbot, I've been maintaining about 12
> crappy old machines for about 7 years now [1] - I'm happy to do the
> same job for a couple of properly hosted PPC machines.


That's awesome persistence. The NumPy and SciPy buildbots certainly weren't
maintained like that, half of them were offline or broken for long periods
usually.


>  At least we'd
> have some way of testing for these machines, if we get stuck - even if
> that involved spinning up a VM and installing the stuff we needed from
> the command line.
>

I do see the value of testing on more platforms of course. It's just about
logistics/responsibilities. If you're saying that you'll do the
maintenance, and want to apply for resources using the NumPy name, that's
much better, I think, than making "the numpy devs" collectively responsible.

Ralf


Re: [Numpy-discussion] PowerPC testing servers

2017-02-15 Thread Ralf Gommers
On Thu, Feb 16, 2017 at 8:02 AM, Matthew Brett 
wrote:

> Hey,
>
> A recent post to the wheel-builders mailing list pointed out some
> links to places providing free PowerPC hosting for open source
> projects, if they agree to a submitted request:
>
> https://mail.python.org/pipermail/wheel-builders/2017-February/000257.html
>
> It would be good to get some testing going on these architectures.
> Shall we apply for hosting, as the numpy organization?
>

Those are bare VMs it seems. Remembering the Buildbot and Mailman horrors,
I think we should be very reluctant to take responsibility for
maintaining CI on anything that's not hosted and can be controlled with a
simple config file in our repo.

Ralf


Re: [Numpy-discussion] Building external c modules with mingw64 / numpy

2017-02-14 Thread Ralf Gommers
On Sat, Jan 21, 2017 at 9:23 PM, Schnizer, Pierre <
pierre.schni...@helmholtz-berlin.de> wrote:

> Dear all,
>
>
>
>I built  an external c-module (pygsl) using mingw 64 from
> msys2 mingw64-gcc compiler.
>
>
>
> This build required some changes to numpy.distutils to get the
>
> “python setup.py config”
>
> and
>
> “python setup.py build”
>
> working. In this process I replaced 2 files in numpy.distutils from the
> numpy git repository:
>
> - numpy/distutils/misc_util.py, version ec0e046 on 14 Dec 2016
>
> - numpy/distutils/mingw32ccompiler.py, version ec0e046 on 14 Dec 2016
>
>
>
> mingw32ccompiler.py had to be modified to get it to work:
>
> - the preprocessor had to be defined, as I am using setup.py config
>
> - the runtime library search path had to be specified to the linker
>
> - the include path of the vcruntime had to be added
>
>
>
> I attached a patch reflecting the changes I had to make to the file
> mingw32ccompiler.py.
>
> If this information is useful, I am happy to answer questions.
>

Thanks for the patch Pierre. For future reference: a pull request on GitHub
or a link to a Gist is preferred for us and usually gets you a response
quicker.

Regarding your question in the patch on including Python's install
directory: that shouldn't be necessary, and I'd be wary of applying your
patch without understanding why the current numpy.distutils code doesn't
work for you. But if your patch works for you then it can't hurt I think.

Cheers,
Ralf



>
>
> Sincerely yours
>
>Pierre
>
>
>
> PS  Version infos:
>
> Python:
>
> Python 3.6.0 (v3.6.0:41df79263a11, Dec 23 2016, 08:06:12) [MSC v.1900 64
> bit (AMD64)] on win32
>
>
>
> Numpy:
>
> >> help(numpy.version)
>
> Help on module numpy.version in numpy:
>
> DATA
>
> full_version = '1.12.0'
>
> git_revision = '561f1accf861ad8606ea2dd723d2be2b09a2dffa'
>
> release = True
>
> short_version = '1.12.0'
>
> version = '1.12.0'
>
>
>
> gcc.exe (Rev2, Built by MSYS2 project) 6.2.0
>
>
>
>
>
> --
>
> Helmholtz-Zentrum Berlin für Materialien und Energie GmbH
>
> Mitglied der Hermann von Helmholtz-Gemeinschaft Deutscher
> Forschungszentren e.V.
>
> Aufsichtsrat: Vorsitzender Dr. Karl Eugen Huthmacher, stv. Vorsitzende Dr.
> Jutta Koch-Unterseher
> Geschäftsführung: Prof. Dr. Anke Rita Kaysser-Pyzalla, Thomas Frederking
>
> Sitz Berlin, AG Charlottenburg, 89 HRB 5583
>
> Postadresse:
> Hahn-Meitner-Platz 1
> D-14109 Berlin
>
> http://www.helmholtz-berlin.de
>
>
>


Re: [Numpy-discussion] Marten van Kerkwijk added to numpy team.

2017-02-14 Thread Ralf Gommers
On Tue, Feb 14, 2017 at 10:01 AM, Charles R Harris <
charlesr.har...@gmail.com> wrote:

> Hi All,
>
> I'm pleased to welcome Marten to the numpy team. His reviews of PRs have
> been very useful in the past and I am happy that he has accepted our
> invitation to join the team.
>

Excellent, welcome Marten!

Ralf


Re: [Numpy-discussion] GSoC 2017: NumFocus will be an umbrella organization

2017-01-18 Thread Ralf Gommers
Hi Max,

On Tue, Jan 17, 2017 at 2:38 AM, Max Linke  wrote:

> Hi
>
> Organizations can start submitting applications for Google Summer of Code
> 2017 on January 19 (and the deadline is February 9)
>
> https://developers.google.com/open-source/gsoc/timeline?hl=en


Thanks for bringing this up, and for organizing the NumFOCUS participation!


> NumFOCUS will be applying again this year. If you want to work with us
> please let me know and if you apply as an organization yourself or under a
> different umbrella organization please tell me as well.


I suspect we won't participate at all, but if we do then it's likely under
the PSF umbrella as we have done previously.

@all: in practice working on NumPy is just far too hard for most GSoC
students. Previous years we've registered and generated ideas, but not
gotten any students. We're also short on maintainer capacity. So I propose
to not participate this year.

Ralf


Re: [Numpy-discussion] NumPy 1.12.0 release

2017-01-16 Thread Ralf Gommers
On Mon, Jan 16, 2017 at 12:43 PM, Charles R Harris <
charlesr.har...@gmail.com> wrote:

> Hi All,
>
> I'm pleased to announce the NumPy 1.12.0 release. This release supports
> Python 2.7 and 3.4-3.6. Wheels for all supported Python versions may be
> downloaded from PyPI
> , the tarball
> and zip files may be downloaded from Github
> . The release notes
> and file hashes may also be found at Github
>  .
>
> NumPy 1.12.0rc2 is the result of 418 pull requests submitted by 139
> contributors and comprises a large number of fixes and improvements. Among
> the many improvements it is difficult to pick out just a few as standing
> above the others, but the following may be of particular interest or
> indicate areas likely to have future consequences.
>
> * Order of operations in ``np.einsum`` can now be optimized for large
> speed improvements.
> * New ``signature`` argument to ``np.vectorize`` for vectorizing with core
> dimensions.
> * The ``keepdims`` argument was added to many functions.
> * New context manager for testing warnings
> * Support for BLIS in numpy.distutils
> * Much improved support for PyPy (not yet finished)
>
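Two of the bullet points above are easy to demonstrate; a minimal sketch using the 1.12 APIs named in the notes (`optimize=` for `np.einsum`, `signature=` for `np.vectorize`):

```python
import numpy as np

a = np.arange(9.0).reshape(3, 3)
b = a + 1.0
c = a + 2.0

# einsum can now optimize the contraction order of a chained product:
chained = np.einsum("ij,jk,kl->il", a, b, c, optimize=True)
print(np.allclose(chained, a @ b @ c))  # True

# vectorize with core dimensions via the new signature argument:
row_norm = np.vectorize(np.linalg.norm, signature="(n)->()")
pts = np.array([[3.0, 4.0], [6.0, 8.0]])
print(np.allclose(row_norm(pts), [5.0, 10.0]))  # True
```

For large chained contractions, `optimize=True` picks an intermediate order that can reduce the operation count by orders of magnitude.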

Thanks for all the heavy lifting on this one Chuck!

Ralf


Re: [Numpy-discussion] Fwd: Backslash operator A\b and np/sp.linalg.solve

2017-01-09 Thread Ralf Gommers
On Mon, Jan 9, 2017 at 4:17 AM, Ilhan Polat  wrote:

>
> Hi everyone,
>
> I was stalking the "deprecating numpy.matrix" discussion on the other
> thread and I wondered whether the mailing list is a better place for the
> discussion about something I've been meaning to ask the dev members. I
> thought mailing lists are something we dumped using together with ICQ and
> geocities stuff but apparently not :-)
>
> Anyways, first thing is first: I have been in need of the ill-conditioned
> warning behavior of matlab (and possibly other software suites) for my own
> work. So I looked around in the numpy issues and found
> https://github.com/numpy/numpy/issues/3755 some time ago. Then I've
> learned from @rkern that there were C translations involved in the numpy
> source and frankly I couldn't even find the entry point of how the project
> is structured so I've switched to SciPy side where things are a bit more
> convenient. Next to teaching me more about f2py machinery, I have noticed
> that the linear algebra module is a bit less competitive than the usual
> level of scipy though it is definitely a personal opinion.
>
> So in order to get the ill-conditioning (or at least the condition number)
> I've wrapped up a PR using the expert routines of LAPACK (which is I think
> ready to merge) but still it is far from the contemporary software
> convenience that you generally get.
>
> https://github.com/scipy/scipy/pull/6775
>
> The "assume_a" keyword introduced here is hopefully modular enough that
> should there be any need for more structures we can simply keep adding to
> the list without any backwards compatibility. It will be at least offering
> more options than what we have currently. The part that I would like to
> discuss requires a bit of intro so please bear with me. Let me copy/paste
> the part from the old PR:
>
>   Around many places online, we can witness the rant about numpy/scipy not
> letting the users know about the conditioning for example Mike Croucher's
> blog  and numpy/numpy#3755
> 
>
>   Since we don't have any central backslash function that optimizes
> depending on the input data, should we create a function, let's name it
> with the matlab equivalent for the time being linsolve such that it
> automatically calls for the right solver? This way, we can avoid writing
> new functions for each LAPACK driver. As a reference, here is a SO thread
> 
> that summarizes the linsolve
>  functionality.
>
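For reference, the MATLAB-style warning being asked for can already be emulated in user code; a minimal sketch using only NumPy (the function name and threshold are illustrative, not an existing numpy/scipy interface):

```python
import warnings
import numpy as np

def solve_with_warning(a, b, cond_threshold=1e12):
    """Solve a @ x = b, warning when `a` is ill-conditioned.

    Mimics the rcond-based warning discussed here; np.linalg.solve itself
    only raises for (numerically) exactly singular input.
    """
    cond = np.linalg.cond(a)
    if cond > cond_threshold:
        warnings.warn("matrix is ill-conditioned (cond ~ %.1e); "
                      "the solution may be inaccurate" % cond)
    return np.linalg.solve(a, b)

# A nearly singular 2x2 system triggers the warning:
a = np.array([[1.0, 1.0], [1.0, 1.0 + 1e-14]])
b = np.array([2.0, 2.0])
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    x = solve_with_warning(a, b)
print(len(caught))  # 1
```

The expert (?SVX) LAPACK drivers wrapped in the PR compute the reciprocal condition number as a by-product, so a check like this comes essentially for free there.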

Note that you're proposing a new scipy feature (right?) on the numpy
list.

This sounds like a good idea to me. As a former heavy Matlab user I
remember a lot of things to dislike, but "\" behavior was quite nice.


> I'm sure you are aware, but just for completeness, the linear equation
> solvers are often built around the concept of polyalgorithm which is a
> fancy way of saying that the array is tested consecutively for certain
> structures and the checks are ordered in such a way that the simpler
> structure is tested the sooner. E.g. first check for diagonal matrix, then
> for upper/lower triangular then permuted triangular then symmetrical and so
> on. Here is also another example from AdvanPix
> http://www.advanpix.com/2016/10/07/architecture-of-linear-systems-solver/
>
> Now, according to what I have coded and optimized as much as I can, a pure
> Python is not acceptable as an overhead during these checks. It would
> definitely be a noticeable slowdown if this was in place in the existing
> linalg.solve however I think this is certainly doable in the low-level
> C/FORTRAN level.
>
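The cascade described above can be sketched in a few lines; this is purely illustrative of the check ordering (not scipy's implementation, and, as noted, doing these checks per call in pure Python is exactly the overhead being worried about):

```python
import numpy as np

def detect_structure(a):
    """Classify a square matrix, most specific (cheapest) checks first."""
    if np.array_equal(a, np.diag(np.diag(a))):
        return "diagonal"
    if np.array_equal(a, np.tril(a)):
        return "lower triangular"
    if np.array_equal(a, np.triu(a)):
        return "upper triangular"
    if np.array_equal(a, a.T):
        return "symmetric"
    return "general"

print(detect_structure(np.diag([1.0, 2.0])))                  # diagonal
print(detect_structure(np.array([[1.0, 0.0], [3.0, 4.0]])))  # lower triangular
print(detect_structure(np.array([[2.0, 1.0], [1.0, 2.0]])))  # symmetric
```

The detected label could then be forwarded to the matching LAPACK driver, much as the `assume_a` keyword in the PR above selects between the ?GE/SY/HE/PO routines.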

How much is a noticeable slowdown? Note that we still have the current
interfaces available for users that know what they need, so a nice
convenience function that is say 5-10% slower would not be the end of the
world.

Ralf



> CPython is certainly faster but I think only a straight C/FORTRAN
> implementation would cut it. Note that we only need the discovery of the
> structure then we can pass to the dedicated solver as is. Hence I'm not
> saying that we should recode the existing solve functionality. We already
> have the branching in place to ?GE/SY/HE/POSVX routines.
>
> ---
>
> The second issue about the current linalg.solve function is when trying to
> solve for right inverse e.g. xA = b. Again with some copy/paste: The right
> inversion is currently a bit annoying, that is to say if we would like to
> compute, say, BA^{-1}, then the user has to explicitly transpose the
> explicitly transposed equation to avoid using an explicit inv(whose use
> should be discouraged anyways)
> x = scipy.linalg.solve(A.T, B.T).T.
>
> Since expert drivers come with a trans switch that can internally handle
> whether to solve the transposed or the regular 
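The double-transpose idiom quoted above is easy to verify numerically; a minimal sketch (np.linalg.solve is used here, but scipy.linalg.solve behaves the same for this purpose):

```python
import numpy as np

a = np.array([[3.0, 1.0], [1.0, 2.0]])  # invertible
b = np.array([[5.0, 4.0], [1.0, 1.0]])

# Solve x @ a = b ("right division") without forming inv(a) explicitly:
x = np.linalg.solve(a.T, b.T).T

print(np.allclose(x @ a, b))  # True
```

An expert-driver `trans` switch would let the solver handle this internally, avoiding the two explicit transposes (and the copies they can imply).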

Re: [Numpy-discussion] Deprecating matrices.

2017-01-07 Thread Ralf Gommers
On Sun, Jan 8, 2017 at 2:09 PM, Charles R Harris 
wrote:

>
>
> On Sat, Jan 7, 2017 at 5:31 PM, CJ Carey 
> wrote:
>
>> I agree with Ralf; coupling these changes to sparse is a bad idea.
>>
>> I think that scipy.sparse will be an important consideration during the
>> deprecation process, though, perhaps as an indicator of how painful the
>> transition might be for third party code.
>>
>> I'm +1 for splitting matrices out into a standalone package.
>>
>
> Decoupled or not, sparse still needs to be dealt with. What is the plan?
>

My view would be:
- keep current sparse matrices as is (with improvements, like
__numpy_func__ and the various performance improvements that regularly get
done)
- once one of the sparse *array* implementations progresses far enough,
merge that and encourage people to switch over
- in the far future, once packages like scikit-learn have switched to the
new sparse arrays, the sparse matrices could potentially also be split off
as a separate package, in the same way as we did for weave and now can do
for npmatrix.

Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Deprecating matrices.

2017-01-07 Thread Ralf Gommers
On Sun, Jan 8, 2017 at 12:42 PM, Charles R Harris <charlesr.har...@gmail.com
> wrote:

>
>
> On Sat, Jan 7, 2017 at 4:35 PM, Ralf Gommers <ralf.gomm...@gmail.com>
> wrote:
>
>>
>>
>> On Sun, Jan 8, 2017 at 12:26 PM, Charles R Harris <
>> charlesr.har...@gmail.com> wrote:
>>
>>>
>>>
>>> On Sat, Jan 7, 2017 at 2:29 PM, Ralf Gommers <ralf.gomm...@gmail.com>
>>> wrote:
>>>
>>>>
>>>> It looks to me like we're getting a bit off track here. The sparse
>>>> matrices in scipy are heavily used, and despite rough edges pretty good at
>>>> what they do. Deprecating them is not a goal.
>>>>
>>>> The actual goal for the exercise that started this thread (at least as
>>>> I see it) is to remove np.matrix from numpy itself so users (that don't
>>>> know the difference) will only use ndarrays. And the few users that prefer
>>>> np.matrix for teaching can now switch because of @, so their preference
>>>> should have disappeared.
>>>>
>>>> To reach that goal, no deprecation or backwards incompatible changes to
>>>> scipy.sparse are needed.
>>>>
>>>
>>> What is the way forward with sparse? That looks like the biggest blocker
>>> on the road to a matrix free NumPy. I don't see moving the matrix package
>>> elsewhere as a solution for that.
>>>
>>
>> Why not?
>>
>>
> Because it doesn't get rid of matrices in SciPy, nor does one gain a
> scalar multiplication operator for sparse.
>

That's a different goal though. You can reach the "get matrix out of numpy"
goal fairly easily (docs and packaging work), but if you insist on coupling
it to major changes to scipy.sparse (a lot more work + backwards compat
break), then what will likely happen is: nothing.

Ralf


Re: [Numpy-discussion] Deprecating matrices.

2017-01-07 Thread Ralf Gommers
On Sun, Jan 8, 2017 at 12:26 PM, Charles R Harris <charlesr.har...@gmail.com
> wrote:

>
>
> On Sat, Jan 7, 2017 at 2:29 PM, Ralf Gommers <ralf.gomm...@gmail.com>
> wrote:
>
>>
>> It looks to me like we're getting a bit off track here. The sparse
>> matrices in scipy are heavily used, and despite rough edges pretty good at
>> what they do. Deprecating them is not a goal.
>>
>> The actual goal for the exercise that started this thread (at least as I
>> see it) is to remove np.matrix from numpy itself so users (that don't know
>> the difference) will only use ndarrays. And the few users that prefer
>> np.matrix for teaching can now switch because of @, so their preference
>> should have disappeared.
>>
>> To reach that goal, no deprecation or backwards incompatible changes to
>> scipy.sparse are needed.
>>
>
> What is the way forward with sparse? That looks like the biggest blocker
> on the road to a matrix free NumPy. I don't see moving the matrix package
> elsewhere as a solution for that.
>

Why not?

Ralf


Re: [Numpy-discussion] Deprecating matrices.

2017-01-07 Thread Ralf Gommers
On Sun, Jan 8, 2017 at 9:31 AM, Todd <toddr...@gmail.com> wrote:

>
>
> On Jan 6, 2017 20:28, "Ralf Gommers" <ralf.gomm...@gmail.com> wrote:
>
>
>
> On Sat, Jan 7, 2017 at 2:21 PM, CJ Carey <perimosocord...@gmail.com>
> wrote:
>
>>
>> On Fri, Jan 6, 2017 at 6:19 PM, Ralf Gommers <ralf.gomm...@gmail.com>
>> wrote:
>>
>>> This sounds like a reasonable idea. Timeline could be something like:
>>>
>>> 1. Now: create new package, deprecate np.matrix in docs.
>>> 2. In say 1.5 years: start issuing visible deprecation warnings in numpy
>>> 3. After 2020: remove matrix from numpy.
>>>
>>> Ralf
>>>
>>
>> I think this sounds reasonable, and reminds me of the deliberate
>> deprecation process taken for scipy.weave. I guess we'll see how successful
>> it was when 0.19 is released.
>>
>> The major problem I have with removing numpy matrices is the effect on
>> scipy.sparse, which mostly-consistently mimics numpy.matrix semantics and
>> often produces numpy.matrix results when densifying. The two are coupled
>> tightly enough that if numpy matrices go away, all of the existing sparse
>> matrix classes will have to go at the same time.
>>
>> I don't think that would be the end of the world,
>>
>
> Not the end of the world literally, but the impact would be pretty major.
> I think we're stuck with scipy.sparse, and may at some point will add a new
> sparse *array* implementation next to it. For scipy we will have to add a
> dependency on the new npmatrix package or vendor it.
>
> Ralf
>
>
>
>> but it's definitely something that should happen while scipy is still
>> pre-1.0, if it's ever going to happen.
>>
>
>
> So what about this:
>
> 1. Create a sparse array class
> 2. (optional) Refactor the sparse matrix class to be based on the sparse
> array class (may not be feasible)
> 3. Copy the spare matrix class into the matrix package
> 4. Deprecate the scipy sparse matrix class
> 5. Remove the scipy sparse matrix class when the numpy matrix class
>

It looks to me like we're getting a bit off track here. The sparse matrices
in scipy are heavily used, and despite rough edges pretty good at what they
do. Deprecating them is not a goal.

The actual goal for the exercise that started this thread (at least as I
see it) is to remove np.matrix from numpy itself so users (that don't know
the difference) will only use ndarrays. And the few users that prefer
np.matrix for teaching can now switch because of @, so their preference
should have disappeared.

To reach that goal, no deprecation or backwards incompatible changes to
scipy.sparse are needed.

Ralf


Re: [Numpy-discussion] Deprecating matrices.

2017-01-07 Thread Ralf Gommers
On Sun, Jan 8, 2017 at 8:33 AM, Marten van Kerkwijk <
m.h.vankerkw...@gmail.com> wrote:

> Hi All,
>
> It seems there are two steps that can be taken now and are needed no
> matter what:
>
> 1. Add numpy documentation describing the preferred way to handle
> matrices, extolling the virtues of @, and move np.matrix documentation
> to a deprecated section
>

That would be good to do asap. Any volunteers?


>
> 2. Start on a new `sparse` class that is based on regular arrays


There are two efforts that I know of in this direction:
https://github.com/perimosocordiae/sparray
https://github.com/ev-br/sparr

(and
> uses `__array_ufunc__` instead of prepare/wrap?).
>

Getting __array_ufunc__ finally into a released version of numpy will be a
major improvement for sparse matrix behavior (like making
np.dot(some_matrix) work) in itself.
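For context, here is a minimal sketch of the ufunc override protocol as it eventually shipped as `__array_ufunc__` in numpy 1.13; the `Quantity` class and its names are purely illustrative, not any real library's API:

```python
import numpy as np

class Quantity:
    """Illustrative container that intercepts numpy ufuncs."""
    def __init__(self, value, unit):
        self.value = np.asarray(value)
        self.unit = unit

    def __array_ufunc__(self, ufunc, method, *inputs, **kwargs):
        # Unwrap Quantity operands, run the ufunc, rewrap the result.
        vals = [x.value if isinstance(x, Quantity) else x for x in inputs]
        return Quantity(getattr(ufunc, method)(*vals, **kwargs), self.unit)

q = Quantity([1.0, 2.0], "m")
r = np.multiply(q, 3.0)  # numpy defers to Quantity.__array_ufunc__
```

The same hook is what lets numpy functions built on ufuncs cooperate with non-ndarray containers such as sparse matrices.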

Ralf


Re: [Numpy-discussion] Deprecating matrices.

2017-01-07 Thread Ralf Gommers
On Sat, Jan 7, 2017 at 9:39 PM, Nathaniel Smith <n...@pobox.com> wrote:

> On Fri, Jan 6, 2017 at 11:59 PM, Ralf Gommers <ralf.gomm...@gmail.com>
> wrote:
> >
> >
> > On Sat, Jan 7, 2017 at 2:52 PM, Charles R Harris <
> charlesr.har...@gmail.com>
> > wrote:
> >>
> >>
> >>
> >> On Fri, Jan 6, 2017 at 6:37 PM, <josef.p...@gmail.com> wrote:
> >>>
> >>>
> >>>
> >>>
> >>> On Fri, Jan 6, 2017 at 8:28 PM, Ralf Gommers <ralf.gomm...@gmail.com>
> >>> wrote:
> >>>>
> >>>>
> >>>>
> >>>> On Sat, Jan 7, 2017 at 2:21 PM, CJ Carey <perimosocord...@gmail.com>
> >>>> wrote:
> >>>>>
> >>>>>
> >>>>> On Fri, Jan 6, 2017 at 6:19 PM, Ralf Gommers <ralf.gomm...@gmail.com
> >
> >>>>> wrote:
> >>>>>>
> >>>>>> This sounds like a reasonable idea. Timeline could be something
> like:
> >>>>>>
> >>>>>> 1. Now: create new package, deprecate np.matrix in docs.
> >>>>>> 2. In say 1.5 years: start issuing visible deprecation warnings in
> >>>>>> numpy
> >>>>>> 3. After 2020: remove matrix from numpy.
> >>>>>>
> >>>>>> Ralf
> >>>>>
> >>>>>
> >>>>> I think this sounds reasonable, and reminds me of the deliberate
> >>>>> deprecation process taken for scipy.weave. I guess we'll see how
> successful
> >>>>> it was when 0.19 is released.
> >>>>>
> >>>>> The major problem I have with removing numpy matrices is the effect
> on
> >>>>> scipy.sparse, which mostly-consistently mimics numpy.matrix
> semantics and
> >>>>> often produces numpy.matrix results when densifying. The two are
> coupled
> >>>>> tightly enough that if numpy matrices go away, all of the existing
> sparse
> >>>>> matrix classes will have to go at the same time.
> >>>>>
> >>>>> I don't think that would be the end of the world,
> >>>>
> >>>>
> >>>> Not the end of the world literally, but the impact would be pretty
> >>>> major. I think we're stuck with scipy.sparse, and may at some point
> will add
> >>>> a new sparse *array* implementation next to it. For scipy we will
> have to
> >>>> add a dependency on the new npmatrix package or vendor it.
> >>>
> >>>
> >>> That sounds to me like moving maintenance of numpy.matrix from numpy to
> >>> scipy, if scipy.sparse is one of the main users and still depends on
> it.
> >
> >
> > Maintenance costs are pretty low, and are partly still for numpy (it has
> to
> > keep subclasses like np.matrix working). I'm not too worried about the
> > effort. The purpose here is to remove np.matrix from numpy so beginners
> will
> > never see it. Educating sparse matrix users is a lot easier, and there
> are a
> > lot less such users.
> >
> >>
> >> What I was thinking was encouraging folks to use `arr.dot(...)` or `@`
> >> instead of `*` for matrix multiplication, keeping `*` for scalar
> >> multiplication.
> >
> >
> > I don't think that change in behavior of `*` is doable.
>
> I guess it would be technically possible to have matrix.__mul__ issue
> a deprecation warning before matrix.__init__ does, to try and
> encourage people to switch to using .dot and/or @, and thus make it
> easier to later port their code to regular arrays?


Yes, but that's not very relevant. I'm saying "not doable" since after the
debacle with changing diag return to a view my understanding is we decided
that it's a bad idea to make changes that don't break code but return
different numerical results. There's no good way to work around that here.

With something as widely used as np.matrix, you simply cannot rely on
people porting code. You just need to phase out np.matrix in a way that
breaks code but never changes behavior silently (even across multiple
releases).

Ralf


Re: [Numpy-discussion] Deprecating matrices.

2017-01-06 Thread Ralf Gommers
On Sat, Jan 7, 2017 at 2:52 PM, Charles R Harris <charlesr.har...@gmail.com>
wrote:

>
>
> On Fri, Jan 6, 2017 at 6:37 PM, <josef.p...@gmail.com> wrote:
>
>>
>>
>>
>> On Fri, Jan 6, 2017 at 8:28 PM, Ralf Gommers <ralf.gomm...@gmail.com>
>> wrote:
>>
>>>
>>>
>>> On Sat, Jan 7, 2017 at 2:21 PM, CJ Carey <perimosocord...@gmail.com>
>>> wrote:
>>>
>>>>
>>>> On Fri, Jan 6, 2017 at 6:19 PM, Ralf Gommers <ralf.gomm...@gmail.com>
>>>> wrote:
>>>>
>>>>> This sounds like a reasonable idea. Timeline could be something like:
>>>>>
>>>>> 1. Now: create new package, deprecate np.matrix in docs.
>>>>> 2. In say 1.5 years: start issuing visible deprecation warnings in
>>>>> numpy
>>>>> 3. After 2020: remove matrix from numpy.
>>>>>
>>>>> Ralf
>>>>>
>>>>
>>>> I think this sounds reasonable, and reminds me of the deliberate
>>>> deprecation process taken for scipy.weave. I guess we'll see how successful
>>>> it was when 0.19 is released.
>>>>
>>>> The major problem I have with removing numpy matrices is the effect on
>>>> scipy.sparse, which mostly-consistently mimics numpy.matrix semantics and
>>>> often produces numpy.matrix results when densifying. The two are coupled
>>>> tightly enough that if numpy matrices go away, all of the existing sparse
>>>> matrix classes will have to go at the same time.
>>>>
>>>> I don't think that would be the end of the world,
>>>>
>>>
>>> Not the end of the world literally, but the impact would be pretty
>>> major. I think we're stuck with scipy.sparse, and may at some point will
>>> add a new sparse *array* implementation next to it. For scipy we will have
>>> to add a dependency on the new npmatrix package or vendor it.
>>>
>>
>> That sounds to me like moving maintenance of numpy.matrix from numpy to
>> scipy, if scipy.sparse is one of the main users and still depends on it.
>>
>
Maintenance costs are pretty low, and are partly still for numpy (it has to
keep subclasses like np.matrix working). I'm not too worried about the
effort. The purpose here is to remove np.matrix from numpy so beginners
will never see it. Educating sparse matrix users is a lot easier, and there
are a lot fewer such users.


> What I was thinking was encouraging folks to use `arr.dot(...)` or `@`
> instead of `*` for matrix multiplication, keeping `*` for scalar
> multiplication.
>

I don't think that change in behavior of `*` is doable.


> If those operations were defined for matrices,
>

Why if? They are defined, and work as expected as far as I can tell.


> then at some point sparse could go to arrays and it would not be
> noticeable except for the treatment of 1-D arrays -- which admittedly might
> be a bit tricky.
>

I'd like that to be feasible, but especially given that any such change
would not break code but rather silently change numerical values, it's
likely not a healthy idea.

Ralf
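To make the semantics under discussion concrete: for ndarrays `*` is elementwise and `@` is the matrix product, while for np.matrix `*` already means matrix product and `@` agrees with it. A small sketch (np.matrix used purely for illustration, with arbitrary values):

```python
import numpy as np

a = np.array([[0.0, 1.0], [2.0, 3.0]])
m = np.matrix(a)

elementwise = a * a   # ndarray: elementwise product
matmul = a @ a        # ndarray: matrix product
# np.matrix: m * m is the matrix product, the same result as m @ m
```

This is why silently switching `*` on (sparse) matrices to elementwise semantics would change numerical results without breaking any code.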


Re: [Numpy-discussion] Deprecating matrices.

2017-01-06 Thread Ralf Gommers
On Sat, Jan 7, 2017 at 2:21 PM, CJ Carey <perimosocord...@gmail.com> wrote:

>
> On Fri, Jan 6, 2017 at 6:19 PM, Ralf Gommers <ralf.gomm...@gmail.com>
> wrote:
>
>> This sounds like a reasonable idea. Timeline could be something like:
>>
>> 1. Now: create new package, deprecate np.matrix in docs.
>> 2. In say 1.5 years: start issuing visible deprecation warnings in numpy
>> 3. After 2020: remove matrix from numpy.
>>
>> Ralf
>>
>
> I think this sounds reasonable, and reminds me of the deliberate
> deprecation process taken for scipy.weave. I guess we'll see how successful
> it was when 0.19 is released.
>
> The major problem I have with removing numpy matrices is the effect on
> scipy.sparse, which mostly-consistently mimics numpy.matrix semantics and
> often produces numpy.matrix results when densifying. The two are coupled
> tightly enough that if numpy matrices go away, all of the existing sparse
> matrix classes will have to go at the same time.
>
> I don't think that would be the end of the world,
>

Not the end of the world literally, but the impact would be pretty major. I
think we're stuck with scipy.sparse, and may at some point will add a new
sparse *array* implementation next to it. For scipy we will have to add a
dependency on the new npmatrix package or vendor it.

Ralf



> but it's definitely something that should happen while scipy is still
> pre-1.0, if it's ever going to happen.
>
>
>


Re: [Numpy-discussion] Deprecating matrices.

2017-01-06 Thread Ralf Gommers
On Wed, Jan 4, 2017 at 9:07 AM, Bryan Van de Ven 
wrote:

> There's a good chance that bokeh.charts will be split off into a
> separately distributed package as well. Hopefully being a much smaller,
> pure Python project makes it a more accessible target for anyone interested
> in maintaining it, and if no one is interested in it anymore, well that
> fact becomes easier to judge. I think it would be a reasonable approach
> here for the same reasons.
>
> Bryan
>
> On Jan 3, 2017, at 13:54, Benjamin Root  wrote:
>
> That's not a bad idea. Matplotlib is currently considering something
> similar for its mlab module. It has been there since the beginning, but it
> is very outdated and very out-of-scope for matplotlib. However, there are
> still lots of code out there that depends on it. So, we are looking to
> split it off as its own package. The details still need to be worked out
> (should we initially depend on the package and simply alias its import with
> a DeprecationWarning, or should we go cold turkey and have a good message
> explaining the change).
>
Don't go cold turkey please, that still would break a lot of code. Even
with a good message, breaking things isn't great.


>
> Ben Root
>
>
> On Tue, Jan 3, 2017 at 2:31 PM, Todd  wrote:
>
>> On Mon, Jan 2, 2017 at 8:36 PM, Charles R Harris <
>> charlesr.har...@gmail.com> wrote:
>>
>>> Hi All,
>>>
>>> Just throwing this click bait out for discussion. Now that the `@`
>>> operator is available and things seem to be moving towards Python 3,
>>> especially in the classroom, we should consider the real possibility of
>>> deprecating the matrix type and later removing it. No doubt there are old
>>> scripts that require them, but older versions of numpy are available for
>>> those who need to run old scripts.
>>>
>>> Thoughts?
>>>
>>> Chuck
>>>
>>>
>> What if the matrix class was split out into its own project, perhaps as a
>> scikit.
>>
> Something like "npmatrix" would be a better name, we'd like to keep
scikit- for active well-maintained projects I'd think.


>   That way those who really need it can still use it.  If there is
>> sufficient desire for it, those who need it can maintain it.  If not, it
>> will hopefully it will take long enough for it to bitrot that everyone has
>> transitioned.
>>
>
This sounds like a reasonable idea. Timeline could be something like:

1. Now: create new package, deprecate np.matrix in docs.
2. In say 1.5 years: start issuing visible deprecation warnings in numpy
3. After 2020: remove matrix from numpy.

Ralf


Re: [Numpy-discussion] Deprecating matrices.

2017-01-02 Thread Ralf Gommers
On Tue, Jan 3, 2017 at 2:36 PM, Charles R Harris 
wrote:

> Hi All,
>
> Just throwing this click bait out for discussion. Now that the `@`
> operator is available and things seem to be moving towards Python 3,
> especially in the classroom, we should consider the real possibility of
> deprecating the matrix type and later removing it. No doubt there are old
> scripts that require them, but older versions of numpy are available for
> those who need to run old scripts.
>
> Thoughts?
>

Clearly deprecate in the docs now, and warn only later imho. We can't warn
before we have a good solution for scipy.sparse matrices, which have matrix
semantics and return matrix instances.

Ralf


Re: [Numpy-discussion] script for building numpy from source in virtualenv and outside it

2016-12-26 Thread Ralf Gommers
On Tue, Dec 27, 2016 at 9:43 AM, Felipe Vieira  wrote:

> Dear fellows,
>
> I'm struggling with a single script to build numpy from source in a
> virtual env. I want the same script to be able to be run with a normal env.
>
> So if ran from a normal env it should affect all users.
> If ran within a virtual env the installation should be constrained to that
> env.
>
> I tried setting script variables and other tricks but the script is always
> executed as a 'out of virtual env' user (I cannot make it aware that is
> running from a virtualenv), thus affecting my real working python. As the
> script activates other scripts I am not posting them for now (hoping that
> this is a simple issue).
>
> tl;dr: How can I install numpy from source and build it in a script which
> uses the virtual env instead of affecting the whole system?
>
> (And yes, I have looked for solutions on google but none of them worked.)
>

Sounds like you just need to run your script with the Python interpreter in
the virtualenv. There's nothing numpy-specific about this.

Ralf
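A minimal sketch of that advice using only the standard library (the env name is hypothetical): the key point is to always invoke the environment's own interpreter, so installs never touch the system Python:

```python
# Create an isolated env and run a command with *its* interpreter.
import subprocess
import sys
import tempfile
import venv
from pathlib import Path

env_dir = Path(tempfile.mkdtemp()) / "np-env"
venv.EnvBuilder(with_pip=False).create(env_dir)  # with_pip=False: fast, offline
py = env_dir / ("Scripts/python.exe" if sys.platform == "win32" else "bin/python")
out = subprocess.run(
    [str(py), "-c", "import sys; print(sys.prefix != sys.base_prefix)"],
    capture_output=True, text=True,
)
# A numpy source build would then be run from the source checkout as
# [str(py), "-m", "pip", "install", "."] (requires with_pip=True above).
```

Running the script with `./np-env/bin/python script.py` (rather than a bare `python script.py`) gives the same guarantee from the shell.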


Re: [Numpy-discussion] PyPy wheels (was: NumPy 1.12.0b1 released)

2016-11-18 Thread Ralf Gommers
On Sat, Nov 19, 2016 at 2:53 PM, Nathaniel Smith <n...@pobox.com> wrote:

> On Nov 18, 2016 3:30 PM, "Ralf Gommers" <ralf.gomm...@gmail.com> wrote:
> >
> >
> >
> > On Sat, Nov 19, 2016 at 5:24 AM, Nathaniel Smith <n...@pobox.com> wrote:
> >>
> >> Another thing to think about is that 1.12 on pypy won't pass its test
> suite (though it's close), and we're not yet testing new PRs on pypy, so no
> guarantees about 1.13 yet. I think on balance these probably aren't reasons
> *not* to upload wheels, but it's a funny place where we're talking about
> providing "official" builds even though it's not an "officially supported
> platform". So we will at least want to be clear about that. And someone
> will have to handle the bug reports about the test suite failing :-).
> >
> >
> > Those are good points. We could run PyPy on TravisCI; the PyPy install
> and numpy build aren't difficult anymore.
>
> I'm not sure how useful this until the test suite is passing, though, just
> because of how travis-ci works.
>
It's not hard to skip the currently failing tests only on a single run in
the build matrix. That would keep TravisCI green and ensure there are no
regressions.

Ralf
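One common way to express "skip the known failures on just the PyPy matrix entry" at the test level; the test name and skip reason here are hypothetical stand-ins, not actual numpy tests:

```python
# Mark known-failing tests so a PyPy CI entry stays green while
# regressions elsewhere are still caught.
import platform
import unittest

ON_PYPY = platform.python_implementation() == "PyPy"

class TestUpdateIfCopy(unittest.TestCase):
    @unittest.skipIf(ON_PYPY, "relies on deterministic refcounting")
    def test_writeback_semantics(self):
        self.assertEqual(1 + 1, 2)  # placeholder body

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestUpdateIfCopy)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

On CPython the test runs normally; on PyPy it is reported as skipped rather than failed.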



> The main outstanding issue needs some work on the numpy side: basically
> UPDATEIFCOPY, as currently conceived, just can't work reliably on pypy,
> because it assumes that Python objects get deterministically deallocated as
> soon as their reference count drops to zero, and pypy doesn't have
> reference counts.
>
> I think fixing this should be straightforward enough, in case anyone wants
> to work on it. Every user of UPDATEIFCOPY already has to be aware of the
> reference counts and know when the pseudo-"view" array is supposed to be
> deallocated. So I think we should define a new UPDATEIFCOPY2 flag, which
> acts like the current UPDATEIFCOPY, except that instead of using __del__ to
> do the writeback, there's an explicit API call you have to make, like
> PyArray_UpdateIfCopy2_Writeback, that checks for the flag and does the
> writeback if set. Then we should transition to using this internally, and
> probably deprecate UPDATEIFCOPY (though we may never be able to get rid of
> it entirely).
>
> -n
>
>
>
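The explicit-writeback idea in the quoted proposal can be sketched at the Python level; this is pure illustration (the real mechanism lives in numpy's C API, and the class here is hypothetical):

```python
import numpy as np

class WritebackCopy:
    """Python-level analogue of the proposed UPDATEIFCOPY2: a scratch
    copy whose write-back is an explicit call rather than a __del__
    side effect, so it also works without reference counting."""
    def __init__(self, base):
        self.base = base
        self.copy = np.ascontiguousarray(base)  # work buffer

    def writeback(self):
        self.base[...] = self.copy  # deterministic, explicit flush

a = np.asfortranarray(np.zeros((2, 2)))
wb = WritebackCopy(a)
wb.copy += 1.0   # mutate the scratch buffer
wb.writeback()   # caller decides when the base array is updated
```

The point is that the consumer, who already knows the pseudo-"view"'s lifetime, triggers the flush, instead of relying on deallocation order.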


[Numpy-discussion] PyPy wheels (was: NumPy 1.12.0b1 released)

2016-11-18 Thread Ralf Gommers
On Sat, Nov 19, 2016 at 5:24 AM, Nathaniel Smith <n...@pobox.com> wrote:

> Another thing to think about is that 1.12 on pypy won't pass its test
> suite (though it's close), and we're not yet testing new PRs on pypy, so no
> guarantees about 1.13 yet. I think on balance these probably aren't reasons
> *not* to upload wheels, but it's a funny place where we're talking about
> providing "official" builds even though it's not an "officially supported
> platform". So we will at least want to be clear about that. And someone
> will have to handle the bug reports about the test suite failing :-).
>

Those are good points. We could run PyPy on TravisCI; the PyPy install and
numpy build aren't difficult anymore.

Handling bug reports is mostly checking if it's PyPy specific, and if so
refer to the PyPy tracker I'd think. It is some work, but given that PyPy
has finally chosen a way to support Numpy that's not a dead end and has
come quite a long way quite quickly, taking on that bit of extra work as
Numpy maintainers is a good time investment imho.

Many bug reports will go straight to PyPy though I expect, because often
that is the obvious place to go. This is what I just got from downloading
the OS X PyPy binary and pip installing numpy master:

Python version 2.7.12 (aff251e54385, Nov 09 2016, 17:25:49)[PyPy 5.6.0 with
GCC 4.2.1 Compatible Apple LLVM 5.1 (clang-503.0.40)]
nose version 1.3.7
.S...ES.RPython
traceback:
  File "pypy_interpreter.c", line 43348, in
BuiltinCodePassThroughArguments1_funcrun_obj
  File "pypy_module_cpyext_4.c", line 16627, in
generic_cpy_call__StdObjSpaceConst_funcPtr_SomeI_17
Fatal RPython error: AssertionError
Abort trap: 6

So guess I'll go find out where that issue tracker is:)

Ralf


On Nov 18, 2016 01:14, "Ralf Gommers" <ralf.gomm...@gmail.com> wrote:
> >
> >
> >
> > On Fri, Nov 18, 2016 at 9:08 PM, Matthew Brett <matthew.br...@gmail.com>
> wrote:
> >>
> >> Hi,
> >>
> >> On Thu, Nov 17, 2016 at 3:24 PM, Matti Picus <matti.pi...@gmail.com>
> wrote:
> >> > Congrats to all on the release.Two questions:
> >> >
> >> > Is there a guide to building standard wheels for NumPy?
> >>
> >> I don't think so - there is a repository that we use to build the
> >> wheels, that has the Windows, OSX and manylinux recipes for the
> >> standard CPython build:
> >>
> >> https://github.com/MacPython/numpy-wheels
> >>
> >> If you can work out a way to automate the PyPy builds and tests -
> >> especially using the same repo - that would be very useful.
> >>
> >> > Assuming I can build standardized PyPy 2.7 wheels for Ubuntu, Win32
> and
> >> > OSX64, how can I get them blessed and uploaded to PyPI?
> >>
> >> If you can automate the build and tests, I'm guessing there will be no
> >> objections - but it's not my call...
> >
> >
> > I'm in favor, assuming that the wheel tags and PyPy backwards
> compatibility situation is OK. Can't really find any examples. What I mean
> is that for CPython wheels contain tags like "cp27" or "cp35". PyPy wheels
> should have tags "pp". Are the PyPy cpyext layer and the
>  defined such that a new PyPy release won't break older wheels?
>
>


Re: [Numpy-discussion] NumPy 1.12.0b1 released

2016-11-18 Thread Ralf Gommers
On Fri, Nov 18, 2016 at 9:08 PM, Matthew Brett 
wrote:

> Hi,
>
> On Thu, Nov 17, 2016 at 3:24 PM, Matti Picus 
> wrote:
> > Congrats to all on the release.Two questions:
> >
> > Is there a guide to building standard wheels for NumPy?
>
> I don't think so - there is a repository that we use to build the
> wheels, that has the Windows, OSX and manylinux recipes for the
> standard CPython build:
>
> https://github.com/MacPython/numpy-wheels
>
> If you can work out a way to automate the PyPy builds and tests -
> especially using the same repo - that would be very useful.
>
> > Assuming I can build standardized PyPy 2.7 wheels for Ubuntu, Win32 and
> > OSX64, how can I get them blessed and uploaded to PyPI?
>
> If you can automate the build and tests, I'm guessing there will be no
> objections - but it's not my call...
>

I'm in favor, assuming that the wheel tags and PyPy backwards compatibility
situation is OK. Can't really find any examples. What I mean is that for
CPython, wheels contain tags like "cp27" or "cp35". PyPy wheels should have
tags "pp". Are the PyPy cpyext layer and the  defined
such that a new PyPy release won't break older wheels?

Ralf
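A sketch of how the interpreter part of a wheel tag is formed, "cp" for CPython and "pp" for PyPy plus the version digits. This is simplified: real wheel filenames also carry ABI and platform tags, and the mapping dict here is an illustrative assumption:

```python
import platform
import sys

# Interpreter prefixes by implementation; "py" is the generic fallback.
PREFIXES = {"CPython": "cp", "PyPy": "pp"}
impl = PREFIXES.get(platform.python_implementation(), "py")
tag = f"{impl}{sys.version_info[0]}{sys.version_info[1]}"
```

So a CPython 2.7 wheel gets "cp27", while a PyPy wheel gets a "pp"-prefixed tag; whether old "pp" wheels keep working across PyPy releases is exactly the cpyext stability question above.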


Re: [Numpy-discussion] Status of NAs for integer arrays

2016-11-18 Thread Ralf Gommers
On Fri, Nov 18, 2016 at 8:43 PM, Dmitrii Izgurskii 
wrote:

> Hi,
>
> I would like to know where I can follow the progress of what's being done
> on implementing missing values in integer arrays. Some threads I found are
> from 2012. Is really nothing being done on this issue?
>

Not that I'm aware of, no.

Ralf


Re: [Numpy-discussion] __numpy_ufunc__

2016-11-07 Thread Ralf Gommers
On Mon, Nov 7, 2016 at 9:21 AM, Nathan Goldbaum 
wrote:

> I'm interested in this, but was never able to dive in to the lengthy
> discussions on this. I'm curious if there is a summary of the current
> state of things somewhere that I could read to catch up.


Old proposal, original implementation:
https://github.com/numpy/numpy/blob/master/doc/neps/ufunc-overrides.rst

Main issue for discussion: https://github.com/numpy/numpy/issues/5844. That
really needs a summary, but I'm not sure there's an up-to-date one.

Ralf



> On Sunday, November 6, 2016, Charles R Harris 
> wrote:
>
>>
>>
>> On Sun, Nov 6, 2016 at 11:44 AM, Charles R Harris <
>> charlesr.har...@gmail.com> wrote:
>>
>>> Hi All,
>>>
>>> For those interested in continuing the __numpy_ufunc__ saga, there is a pull
>>> request enabling it . Likely
>>> we will want to make some changes up front before merging that, so some
>>> discussion is in order.
>>>
>>>
>> As a first order of business, let's decide whether to remove the index
>> and rename `__numpy_ufunc__`. The motivation for this is discussed in issue
>> #5986. 
>> If we decide positive on that (I'm in favor), I would be happy with the
>> proposed name `__array_ufunc__`, although something more descriptive like
>> `__handle_ufunc__` might be better. This is a wonderful opportunity for
>> bike shedding for those so inclined ;)
>>
>> Chuck
>>
>>
>
>


Re: [Numpy-discussion] __numpy_ufunc__

2016-11-07 Thread Ralf Gommers
On Mon, Nov 7, 2016 at 9:10 AM, Charles R Harris 
wrote:

>
>
> On Sun, Nov 6, 2016 at 11:44 AM, Charles R Harris <
> charlesr.har...@gmail.com> wrote:
>
>> Hi All,
>>
>> For those interested in continuing the __numpy_ufunc__ saga, there is a pull
>> request enabling it . Likely
>> we will want to make some changes up front before merging that, so some
>> discussion is in order.
>>
>>
> As a first order of business, let's decide whether to remove the index and
> rename `__numpy_ufunc__`. The motivation for this is discussed in issue
> #5986. 
> If we decide positive on that (I'm in favor),
>

It seems like everyone on that issue is in favor or at least +0. So +1 from
me too.


> I would be happy with the proposed name `__array_ufunc__`, although
> something more descriptive like `__handle_ufunc__` might be better.
>



> This is a wonderful opportunity for bike shedding for those so inclined ;)
>

pass :)

Ralf


Re: [Numpy-discussion] Branching NumPy 1.12.x

2016-11-04 Thread Ralf Gommers
On Fri, Nov 4, 2016 at 6:04 AM, Charles R Harris 
wrote:

> Hi All,
>
> I'm thinking that it is time to branch NumPy 1.12.x. I haven't got
> everything in it that I would have liked, in particular __numpy_ufunc__,
> but I think there is plenty of material and not branching is holding up
> some of the more risky stuff.  My current thoughts on __numpy_ufunc__ is
> that it would be best to work it out over the 1.13.0 release cycle,
> starting with enabling it again right after the branch. Julian's work on
> avoiding temporary copies and Pauli's overlap handling PR are two other
> changes I've been putting off but don't want to delay further. There are
> some other smaller things that I had scheduled for 1.12.0, but would like
> to spend more time looking at. If there are some things that you think just
> have to be in 1.12.0, please mention them, but I'd rather aim at getting
> 1.13.0 out in a timely manner.
>
> Thoughts?
>

That's a really good plan I think.

Ralf


Re: [Numpy-discussion] Intel random number package

2016-10-27 Thread Ralf Gommers
On Thu, Oct 27, 2016 at 10:25 AM, Pavlyk, Oleksandr <
oleksandr.pav...@intel.com> wrote:

> Please see responses inline.
>
>
>
> *From:* NumPy-Discussion [mailto:numpy-discussion-boun...@scipy.org] *On
> Behalf Of *Todd
> *Sent:* Wednesday, October 26, 2016 4:04 PM
> *To:* Discussion of Numerical Python 
> *Subject:* Re: [Numpy-discussion] Intel random number package
>
>
>
> On Wed, Oct 26, 2016 at 4:30 PM, Pavlyk, Oleksandr <
> oleksandr.pav...@intel.com> wrote:
>
> Another point already raised by Nathaniel is that for numpy's randomness
> ideally should provide a way to override default algorithm for sampling
> from a particular distribution.  For example RandomState object that
> implements PCG may rely on default acceptance-rejection algorithm for
> sampling from Gamma, while the RandomState object that provides interface
> to MKL might want to call into MKL directly.
>
>
>
> The approach that pyfftw uses at least for scipy, which may also work
> here, is that you can monkey-patch the scipy.fftpack module at runtime,
> replacing it with pyfftw's drop-in replacement.  scipy then proceeds to use
> pyfftw instead of its built-in fftpack implementation.  Might such an
> approach work here?  Users can either use this alternative randomstate
> replacement directly, or they can replace numpy's with it at runtime and
> numpy will then proceed to use the alternative.
>

The only reason that pyfftw uses monkeypatching is that the better approach
is not possible due to license constraints with FFTW (it's GPL).


> I think the monkey-patching approach will work.
>

It will work, for a while at least, but it's bad design.
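
The pattern under discussion, reduced to its generic form (a sketch using a stand-in module object; the names are hypothetical, not pyfftw's actual API):

```python
import types

# Stand-in for a module like scipy.fftpack (illustrative only).
fftpack = types.SimpleNamespace(fft=lambda x: ("builtin", x))

def fast_fft(x):
    """Pretend accelerated replacement, e.g. backed by MKL or FFTW."""
    return ("accelerated", x)

# The monkey-patch: rebind the attribute at runtime. Every caller that
# looks the function up through the module object now silently gets the
# replacement -- which is exactly why it is fragile as a design.
fftpack.fft = fast_fft

print(fftpack.fft([1, 2]))  # → ('accelerated', [1, 2])
```

An explicit, switchable backend registry avoids this action-at-a-distance, which is the direction suggested below.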

We're all on the same page I think that a separate submodule for
random_intel is a no go, but as an explicitly switchable backend for
functions with the same signature it would be fine imho. Of course we don't
have that backend infrastructure today, but it's something we want and have
been discussing anyway.

Ralf


Re: [Numpy-discussion] Intel random number package

2016-10-26 Thread Ralf Gommers
On Wed, Oct 26, 2016 at 8:33 PM, Julian Taylor <
jtaylor.deb...@googlemail.com> wrote:

> On 26.10.2016 06:34, Charles R Harris wrote:
> > Hi All,
> >
> > There is a proposed random number package PR now up on github:
> > https://github.com/numpy/numpy/pull/8209. It is from
> > oleksandr-pavlyk  and implements
> > the numpy random number package using MKL for increased speed. I think
> > we are definitely interested in the improved speed, but I'm not sure
> > numpy is the best place to put the package. I'd welcome any comments on
> > the PR itself, as well as any thoughts on the best way organize or use
> > of this work. Maybe scikit-random
>

Note that this thread is a continuation of
https://mail.scipy.org/pipermail/numpy-discussion/2016-July/075822.html


>
> I'm not a fan of putting code depending on a proprietary library into
> numpy.
> This should be a standalone package which may provide the same interface
> as numpy.
>

I don't really see a problem with that in principle. Numpy can use Intel
MKL (and Accelerate) as well if it's available. It needs some thought put
into the API though - a ``numpy.random_intel`` module is certainly not what
we want.

Ralf


Re: [Numpy-discussion] Scipy installation on Window with mingw32

2016-10-18 Thread Ralf Gommers
Hi,

A few comments:
- you really really want to use a scientific Python distribution to avoid
these issues on Windows. see http://scipy.org/install.html
- we used to build scipy .exe installers with mingw32 but don't do that
anymore because it's just too much of a pain. IIRC the last release we did
that for was 0.16.0, with the toolchain in
https://github.com/numpy/numpy-vendor.
- I don't recognize the error; looks not specific to recent changes in
scipy so there's probably something in your environment not set up quite
right.

Cheers,
Ralf


On Tue, Oct 18, 2016 at 8:18 PM, Shreyank Amartya <
shreyank.amar...@itcinfotech.com> wrote:

> Hi,
>
>
>
> I am trying install to theano which also requires numpy and scipy on
> windows 7 with mingw32 compilers.
>
> I have successfully installed numpy using mingw32 but however when trying
> to install scipy I get this error:
>
>
>
> Looking for python27.dll
>
> Building msvcr library: "c:\python27\libs\libmsvcr90.a" (from
> C:\Windows\win
>
> sxs\amd64_microsoft.vc90.crt_1fc8b3b9a1e18e3b_9.0.21022.8_
> none_750b37ff97f4f68b\
>
> msvcr90.dll)
>
> objdump.exe: C:\Windows\winsxs\amd64_microsoft.vc90.crt_
> 1fc8b3b9a1e18e3b_9.0
>
> .21022.8_none_750b37ff97f4f68b\msvcr90.dll: File format not recognized
>
> Traceback (most recent call last):
>
>   File "", line 1, in 
>
>   File "c:\users\22193\appdata\local\temp\pip-build-d3f_pb\scipy\
> setup.py",
>
> line 415, in 
>
> setup_package()
>
>   File "c:\users\22193\appdata\local\temp\pip-build-d3f_pb\scipy\
> setup.py",
>
> line 411, in setup_package
>
> setup(**metadata)
>
>   File "c:\python27\lib\site-packages\numpy\distutils\core.py", line
> 169, in
>
> setup
>
> return old_setup(**new_attr)
>
>   File "c:\python27\lib\distutils\core.py", line 151, in setup
>
> dist.run_commands()
>
>   File "c:\python27\lib\distutils\dist.py", line 953, in run_commands
>
> self.run_command(cmd)
>
>   File "c:\python27\lib\distutils\dist.py", line 972, in run_command
>
> cmd_obj.run()
>
>   File "c:\python27\lib\site-packages\numpy\distutils\command\install.py",
> l
>
> ine 62, in run
>
> r = self.setuptools_run()
>
>   File "c:\python27\lib\site-packages\numpy\distutils\command\install.py",
> l
>
> ine 36, in setuptools_run
>
> return distutils_install.run(self)
>
>   File "c:\python27\lib\distutils\command\install.py", line 563, in
> run
>
> self.run_command('build')
>
>   File "c:\python27\lib\distutils\cmd.py", line 326, in run_command
>
> self.distribution.run_command(command)
>
>   File "c:\python27\lib\distutils\dist.py", line 972, in run_command
>
> cmd_obj.run()
>
>  File "c:\python27\lib\site-packages\numpy\distutils\command\build.py",
> lin
>
> e 47, in run
>
> old_build.run(self)
>
>   File "c:\python27\lib\distutils\command\build.py", line 127, in run
>
> self.run_command(cmd_name)
>
>   File "c:\python27\lib\distutils\cmd.py", line 326, in run_command
>
> self.distribution.run_command(command)
>
>   File "c:\python27\lib\distutils\dist.py", line 972, in run_command
>
> cmd_obj.run()
>
>   File "c:\python27\lib\site-packages\numpy\distutils\
> command\build_src.py",
>
> line 147, in run
>
> self.build_sources()
>
>   File "c:\python27\lib\site-packages\numpy\distutils\
> command\build_src.py",
>
> line 164, in build_sources
>
> self.build_extension_sources(ext)
>
>   File "c:\python27\lib\site-packages\numpy\distutils\
> command\build_src.py",
>
> line 323, in build_extension_sources
>
> sources = self.generate_sources(sources, ext)
>
>   File "c:\python27\lib\site-packages\numpy\distutils\
> command\build_src.py",
>
> line 376, in generate_sources
>
> source = func(extension, build_dir)
>
>   File "scipy\spatial\setup.py", line 35, in get_qhull_misc_config
>
> if config_cmd.check_func('open_memstream', decl=True, call=True):
>
>   File "c:\python27\lib\site-packages\numpy\distutils\command\config.py",
> li
>
> ne 312, in check_func
>
> self._check_compiler()
>
>   File "c:\python27\lib\site-packages\numpy\distutils\command\config.py",
> li
>
> ne 39, in _check_compiler
>
> old_config._check_compiler(self)
>
>   File "c:\python27\lib\distutils\command\config.py", line 102, in
> _check_co
>
> mpiler
>
> dry_run=self.dry_run, force=1)
>
>   File "c:\python27\lib\site-packages\numpy\distutils\ccompiler.py",
> line 59
>
> 6, in new_compiler
>
> compiler = klass(None, dry_run, force)
>
>   File "c:\python27\lib\site-packages\numpy\distutils\
> mingw32ccompiler.py",
>
> line 96, in __init__
>
> msvcr_success = build_msvcr_library()
>
>   File "c:\python27\lib\site-packages\numpy\distutils\
> mingw32ccompiler.py",
>
> line 360, in build_msvcr_library
>
> generate_def(dll_file, def_file)
>
>   File 

Re: [Numpy-discussion] Integers to negative integer powers, time for a decision.

2016-10-09 Thread Ralf Gommers
On Sun, Oct 9, 2016 at 4:12 AM, Nathaniel Smith  wrote:

> On Sat, Oct 8, 2016 at 6:59 AM, Charles R Harris
>  wrote:
> >
> >
> > On Sat, Oct 8, 2016 at 4:40 AM, Nathaniel Smith  wrote:
> >>
> >> On Fri, Oct 7, 2016 at 6:12 PM, Charles R Harris
> >>  wrote:
> >> > Hi All,
> >> >
> >> > The time for NumPy 1.12.0 approaches and I like to have a final
> decision
> >> > on
> >> > the treatment of integers to negative integer powers with the `**`
> >> > operator.
> >> > The two alternatives looked to be
> >> >
> >> > Raise an error for arrays and numpy scalars, including 1 and -1 to
> >> > negative
> >> > powers.
> >> >
> >> > Pluses
> >> >
> >> > Backward compatible
> >> > Allows common powers to be integer, e.g., arange(3)**2
> >> > Consistent with inplace operators
> >> > Fixes current wrong behavior.
> >> > Preserves type
> >> >
> >> >
> >> > Minuses
> >> >
> >> > Integer overflow
> >> > Computational inconvenience
> >> > Inconsistent with Python integers
> >> >
> >> >
> >> > Always return a float
> >> >
> >> > Pluses
> >> >
> >> > Computational convenience
> >> >
> >> >
> >> > Minuses
> >> >
> >> > Loss of type
> >> > Possible backward incompatibilities
> >> > Not applicable to inplace operators
> >>
> >> I guess I could be wrong, but I think the backwards incompatibilities
> >> are going to be *way* too severe to make option 2 possible in
> >> practice.
> >>
> >
> > Backwards compatibility is also a major concern for me.  Here are my
> current
> > thoughts
> >
> > Add an fpow ufunc that always converts to float, it would not accept
> object
> > arrays.
>
> Maybe call it `fpower` or even `float_power`, for consistency with `power`?
>
> > Raise errors in current power ufunc (**), for ints to negative ints.
> >
> > The power ufunc will change in the following ways
> >
> > +1, -1 to negative ints will error, currently they work
> > n > 1 ints to negative ints will error, currently warn and return zero
> > 0 to negative ints will error, they currently return the minimum integer
> >
> > The `**` operator currently calls the power ufunc, leave that as is for
> > backward almost compatibility. The remaining question is numpy scalars,
> > which we can make either compatible with Python, or with NumPy arrays.
> I'm
> > leaning towards NumPy array compatibility mostly on account of type
> > preservation and the close relationship between zero dimensional arrays
> and
> > scalars.
>
> Sounds good to me. I agree that we should prioritize within-numpy
> consistency over consistency with Python.
>

+1 sounds good to me too.
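
For the record, this is the behavior that was ultimately adopted: integer arrays raise on negative integer exponents, and the float-returning ufunc landed as `np.float_power` (NumPy >= 1.12). A quick illustration:

```python
import numpy as np

# Integer arrays to negative integer powers raise:
try:
    np.arange(1, 4) ** -1
except ValueError as exc:
    print("raised:", exc)

# float_power always computes in floating point, so this is fine
# (returns 1, 1/2, 1/3 as floats):
print(np.float_power(np.arange(1, 4), -1))
```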


>
> > The fpow function could be backported to NumPy 1.11 if that would be
> helpful
> > going forward.
>
> I'm not a big fan of this kind of backport. Violating the
> "bug-fixes-only" rule makes it hard for people to understand our
> release versions. And it creates the situation where people can write
> code that they think requires numpy 1.11 (because it works with their
> numpy 1.11!), but then breaks on other people's computers (because
> those users have 1.11.(x-1)). And if there's some reason why people
> aren't willing to upgrade to 1.12 for new features, then probably
> better to spend energy addressing those instead of on putting together
> 1.11-and-a-half releases.
>

Agreed, this is not something we want to backport.

Ralf



>
> -n
>
> --
> Nathaniel J. Smith -- https://vorpus.org


Re: [Numpy-discussion] update on mailing list issues

2016-10-04 Thread Ralf Gommers
On Tue, Oct 4, 2016 at 11:50 PM, Neal Becker <ndbeck...@gmail.com> wrote:

> Ralf Gommers wrote:
>
> > Hi all,
> >
> > We've had a number of issues with the reliability of the mailman setup
> > that powers the mailing lists for NumPy, SciPy and several other
> projects.
> > To address that we'll start migrating to the python.org provided
> > infrastructure, which should be much more reliable.
> >
> > The full set of lists is here: https://mail.scipy.org/mailman/listinfo.
> > Looks like we have to migrate at least:
> > AstroPy
> > IPython-dev
> > IPython-user
> > NumPy-Discussion
> > SciPy-Dev
> > SciPy-User
> > SciPy-organisers
> >
> > Some of the other ones that are not clearly obsolete but have almost zero
> > activity (APUG, Nipy-devel) we'll have to contact the owners. *-tickets
> > may be useful to archive. The other ones will just be cleaned up, unless
> > someone indicates that there's a reason to keep them around.
> >
> > And a pre-emptive thanks to Didrik and Enthought for taking on the task
> of
> > migrating the archives and user details.
> >
> > Cheers,
> > Ralf
>
> Someone will need to update gmane nntp/mail gateway then, I suppose?
>

Thanks for the reminder. Yes, guess we need to do something there. Not just
yet though, this is what I got when I looked at how to edit list details on
gmane: "Not all of Gmane is back yet - We're working hard to restore
everything"

Ralf


[Numpy-discussion] update on mailing list issues

2016-10-04 Thread Ralf Gommers
Hi all,

We've had a number of issues with the reliability of the mailman setup that
powers the mailing lists for NumPy, SciPy and several other projects. To
address that we'll start migrating to the python.org provided
infrastructure, which should be much more reliable.

The full set of lists is here: https://mail.scipy.org/mailman/listinfo.
Looks like we have to migrate at least:
AstroPy
IPython-dev
IPython-user
NumPy-Discussion
SciPy-Dev
SciPy-User
SciPy-organisers

Some of the other ones that are not clearly obsolete but have almost zero
activity (APUG, Nipy-devel) we'll have to contact the owners. *-tickets may
be useful to archive. The other ones will just be cleaned up, unless
someone indicates that there's a reason to keep them around.

And a pre-emptive thanks to Didrik and Enthought for taking on the task of
migrating the archives and user details.

Cheers,
Ralf


Re: [Numpy-discussion] Vendorize tempita

2016-09-30 Thread Ralf Gommers
On Sat, Oct 1, 2016 at 1:42 PM, Charles R Harris 
wrote:

>
>
> On Fri, Sep 30, 2016 at 10:36 AM, Charles R Harris <
> charlesr.har...@gmail.com> wrote:
>
>>
>>
>> On Fri, Sep 30, 2016 at 10:10 AM, Charles R Harris <
>> charlesr.har...@gmail.com> wrote:
>>
>>>
>>>
>>> On Fri, Sep 30, 2016 at 9:48 AM, Evgeni Burovski <
>>> evgeny.burovs...@gmail.com> wrote:
>>>
 On Fri, Sep 30, 2016 at 6:29 PM, Charles R Harris
  wrote:
 >
 >
 > On Fri, Sep 30, 2016 at 9:21 AM, Benjamin Root 
 wrote:
 >>
 >> This is the first I am hearing of tempita (looks to be a templating
 >> language). How is it a dependency of numpy? Do I now need tempita in
 order
 >> to use numpy, or is it a build-time-only dependency?
 >
 >
 > Build time only. The virtue of tempita is that it can be used to
 generate
 > cython sources. We could adapt one of our current templating scripts
 to do
 > that also, but that would seem to be more work. Note that tempita is
 > currently included in cython, but the cython folks consider that an
 > implementation detail that should not be depended upon.
 >
 > 
 >
 > Chuck
 >


 Ideally, it's packaged in such a way that it's usable for scipy too --
 at the moment it's used in scipy.sparse via Cython.Tempita + a
 fallback to system installed tempita if Cython.Tempita is not
 available (however I'm not sure that fallback is ever exercised).
 Since scipy needs to support numpy down to 1.8.2, a vendorized copy
 will not be usable for scipy for quite a while.

 So, it'd be great to handle it like numpydoc: to have npy_tempita as a
 small self-contained package with the repo under the numpy
 organization and include it via a git submodule. Chuck, do you think
 tempita would need much in terms of maintenance?

 To put some money where my mouth is, I can offer to do some legwork
 for packaging it up.


>>> It might be better to keep tempita and cythonize together so that the
>>> search path works out right. It is also possible that other scripts might
>>> be wanted as cythonize is currently restricted to cython files (*.pyx.in,
>>> *.pxi.in). There are two other templating scripts in numpy/distutils,
>>> and I think f2py has a dependency on one of those.
>>>
>>> If there is a set of tools that would be common to both scipy and numpy,
>>> having them included as a submodule would be a good idea.
>>>
>>>
>> Hmm, I suppose it just depends on where the submodule is, so npy_tempita
>> alone would work fine.  There isn't much maintenance needed if you resist
>> the urge to refactor the code. I removed a six dependency, but that is now
>> upstream as well.
>>
>
> There don't seem to be any objections, so I will put the current
> vendorization in.
>

LGTM as is. tools/ seems to be the right place; it's outside the numpy
package so no one can import it as numpy.something, which is better than a
numpy.externals or numpy.vendor submodule.


> Evgeni, if you think it a good idea to make a repo for this and use
> submodules, go ahead with that. I have left out the testing infrastructure
> at https://github.com/gjhiggins/tempita which runs a sparse set of
> doctests.
>

There's a problem with that: it will then not be possible to do:
git clone ...
python setup.py install  # or equivalent
We shouldn't force everyone to mess with "git submodule" for this. I
suspect a submodule would also break a pip install directly from github.
There'll be very few (if any) changes from upstream Tempita, so making a
reusable npy_tempita seems premature.

Cheers,
Ralf


[Numpy-discussion] Fwd: [NumFOCUS Projects] Job Posting

2016-09-27 Thread Ralf Gommers
Hi all,

The below job posting may be of interest. It's actually a key role for a
quite exciting program, could really have an impact on a large set of
scientific computing projects.

Cheers,
Ralf



-- Forwarded message --
From: Leah Silen 
Date: Thu, Sep 22, 2016 at 6:05 AM
Subject: [NumFOCUS Projects] Job Posting
To: proje...@numfocus.org


The last project email contained information on both the Summit and the
Sustainability Program Job Posting. In case you missed it, we wanted to
resend the job posting info. Please share with those in your circles.

Best,

Leah



NumFOCUS Projects Director

NumFOCUS is seeking to hire a full-time Projects Director to develop and
run a sustainability incubator program for NumFOCUS fiscally sponsored open
source projects. This is the first program of its kind, with opportunity
for enormous impact on the open source ecosystem. The Projects Director
will work with NumFOCUS fiscally sponsored Open Source projects to develop
appropriate business development strategies and opportunities to engage
with industry. The learnings from this program will be public, meaning it
has the potential to change how all open source projects are managed.

NumFOCUS is 501(c)3 whose mission is to promote sustainable high-level
programming languages, open code development, and reproducible scientific
research. They accomplish this mission through their educational programs
and events as well as through fiscal sponsorship of open source data
science projects. They aim to increase collaboration and communication
within the scientific computing community.

The Projects Director position is initially funded for 2 years through a
grant from the Sloan Foundation. The incumbent will be hired as an employee
of NumFOCUS. Review of applications will begin October 10th, and the
position will remain open until filled.

Job description

The Projects Director will lead efforts to identify skills and perspectives
that Open Source Project leads need to establish sustainability models and
seek opportunities for funding and will consult with NumFOCUS projects and
disseminate outcomes.

Responsibilities Include:

   -

   Organize a ‘business bootcamp’ workshop, coach project leads and lead an
   Advisory Board for these business development efforts
   -

   Develop business and financial plans, improve communication, build
   industry relationships, and broadly develop strategies for fostering these
   relationships.
   -

   Lead efforts to identify skills and perspectives that Open Source
   Project leads need to establish sustainability models and seek
   opportunities for funding
   -

   Using materials from workshops and other resources, develop and
   disseminate an Open Source Nurturing and Actionable Practices Kit (OSNAP
   Kit) to provide guidance to open source projects outside of NumFOCUS
   -

   Establish assessment strategies to measure impact and help NumFOCUS
   open-source projects develop industry relationships


Qualifications

As this role provides a business, operations and marketing perspective to
Open Source project leads, the candidate should have:

   -

   Preferred: 4 years experience in business development, operations and/or
   marketing
   -

   Strong oral and written communication skills
   -

   Teaching experience of some capacity, e.g. teaching at universities or
   creating and running workshops
   -

   Demonstrated ability to interact with a broad range of people from
   technical leads to industry leaders
   -

   Experience working with open communities, e.g. open source software,
   hardware or makers,
   -

   A rich network of experts in business and technology to draw ideas and
   collaboration from.


Salary

NumFOCUS offers a competitive salary commensurate with experience and a
comprehensive benefits package.

Applications

To apply, please submit your resume/CV and cover letter (up to two pages)
to NumFOCUS at: projects_direc...@numfocus.org

See the posting at:
http://www.numfocus.org/blog/projects-director-job-posting



---
Leah Silen
Executive Director, NumFOCUS
l...@numfocus.org
512-222-5449


 


Re: [Numpy-discussion] mail.scipy.org update

2016-09-26 Thread Ralf Gommers
On Mon, Sep 26, 2016 at 10:20 PM, Didrik Pinte  wrote:

> If all the SSL certification updates have been done properly, this message
> should go through.
>

It did, thanks Didrik!

Ralf


>
> -- Didrik
>
> On 14 September 2016 at 13:00, Didrik Pinte  wrote:
>
>> Hi everyone,
>>
>> While updating the scipy SSL certificates yesterday, it appeared that
>> filesystem of the servers is corrupted (more than likely a hardware
>> failure). The problem is restricted to one volume and impacts only the web
>> services. The mailing list/mailman service works as expected.
>>
>> We're working on restoring all the different non-functional services.
>>
>> Thanks for your patience!
>>
>> -- Didrik
>>
>
>
>
> --
> Didrik Pinte +32 475 665 668
>+44 1223 969515
> Enthought Inc.dpi...@enthought.com
> Scientific Computing Solutions   http://www.enthought.com
>
> The information contained in this message is Enthought confidential & not
> to be disseminated to outside parties without explicit prior approval from
> sender. This message is intended solely for the addressee(s). If you are
> not the intended recipient, please contact the sender by return e-mail and
> destroy all copies of the original message.
>


Re: [Numpy-discussion] weighted random choice in Python

2016-08-16 Thread Ralf Gommers
On Tue, Aug 16, 2016 at 7:04 AM, Alan Isaac  wrote:

> It seems that there may soon be movement on this enhancement request:
> http://bugs.python.org/issue18844
> The API is currently under discussion.
> If this decision might interact with NumPy in any way,
> it would be good to have that documented now.
> (See Raymond Hettinger's comment today.)
>

Doesn't look like it interacts with numpy.random. The whole enhancement
request doesn't look very interesting imho.

Ralf
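
For context: NumPy already provides weighted sampling (which the CPython proposal does not touch) via the `p` argument of `choice`:

```python
import numpy as np

rng = np.random.RandomState(0)  # fixed seed for reproducibility
# Draw 5 samples with the given per-element probabilities
draws = rng.choice(["a", "b", "c"], size=5, p=[0.7, 0.2, 0.1])
print(draws)
```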


Re: [Numpy-discussion] NumPy in PyPy

2016-08-06 Thread Ralf Gommers
On Sat, Aug 6, 2016 at 9:20 AM, Benjamin Root  wrote:

> Don't know if it is what you are looking for, but NumPy has a built-in
> suite of benchmarks:
> http://docs.scipy.org/doc/numpy/reference/generated/numpy.testing.Tester.bench.html
>

That's the very old (now unused) benchmark runner. Numpy has had an ASV
test suite for a while, see
https://github.com/numpy/numpy/tree/master/benchmarks for how to run it.
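
For those unfamiliar with the format: an ASV benchmark is just a class with `setup` and `time_*` methods that the `asv` tool discovers and times. A minimal sketch (the real suite lives under `benchmarks/` in the numpy repo):

```python
import numpy as np

class TimeCov:
    """Minimal ASV-style benchmark: asv calls setup(), then times
    each time_* method repeatedly."""

    def setup(self):
        # Fixed seed so the workload is identical across runs
        rng = np.random.RandomState(42)
        self.x = rng.randn(50, 500)

    def time_cov(self):
        np.cov(self.x)
```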


> Also, some projects have taken to utilizing the "airspeed velocity"
> utility to track benchmarking stats for their projects. I know astropy
> utilizes it. So, maybe their benchmarks might be a good starting point
> since they utilize numpy heavily?
>
> Cheers!
> Ben Root
>
>
> On Fri, Aug 5, 2016 at 3:42 AM, Papa, Florin 
> wrote:
>
>> Hi,
>>
>>
>>
>> This is Florin Papa from the Dynamic Scripting Languages Optimizations
>> team in Intel Corporation.
>>
>>
>>
>> Our team is working on optimizing the PyPy interpreter and part of this
>> work is to find and fix incompatibilities between NumPy and PyPy. Does
>> anyone have knowledge of real life workloads that use NumPy and cannot be
>> run using PyPy?
>>
>>
>>
>> We are also interested in creating a repository with relevant benchmarks
>> for real world usage of NumPy, like GUPB for CPython, but we have not found
>> such workloads for NumPy.
>>
>
The approach of GUPB is interesting (the whole application part that is,
the rest looks much more cumbersome than ASV benchmarks), but of course
easier to create for Python than for Numpy. You'd need to find whole
applications that spend most of their time in numpy but not in too small a
set of numpy functions. Maybe benchmark suites of other projects aren't
such a bad idea for that. Or spend a bit of time collecting relevant
published ipython notebooks.

Ralf


Re: [Numpy-discussion] Euroscipy

2016-08-03 Thread Ralf Gommers
On Tue, Aug 2, 2016 at 7:22 PM, Sebastian Berg 
wrote:

> Hi all,
>
> I am still pondering whether or not (and if which days) to go to
> EuroScipy. Who else is there and would like to discuss a bit or
> whatever else?
>

Unfortunately I'm not able to go this year.

Ralf


Re: [Numpy-discussion] runtests.py fails f2py test

2016-07-19 Thread Ralf Gommers
On Fri, Jul 15, 2016 at 3:50 PM, Matti Picus  wrote:

> Am I missing something simple? I
> - install git, subversion, gcc, gfortran (Ubuntu 16.04)
> - create a clean python2 virtual env (no numpy)
> - activate it
> - git clone numpy
> - cd into it
> - python runtests.py
> - wait
> And it fails tests because it cannot find f2py.
>

I can confirm that runtests.py doesn't install f2py into the virtualenv bin
directory. This issue is hard to run into though, because if you've ever
installed numpy on your system before (outside a virtualenv), then you
probably already have f2py on your PATH (it's at .local/bin/f2py).


>
> Then I
> - python setup.py install # to install numpy
> - cd numpy/f2py
> - python setup.py build
> And setup fails:
>
> F2PY Version 2
> Traceback (most recent call last):
>   File "setup.py", line 117, in 
> **config)
> TypeError: setup() got multiple values for keyword argument 'version'
>
>
f2py isn't supposed to be installed standalone, I'm not surprised that that
doesn't work.

Ralf



> Can someone give me a one-line hint what I am doing wrong?
> Matti


Re: [Numpy-discussion] Performance issue in covariance function in Numpy 1.9 and later

2016-07-19 Thread Ralf Gommers
On Tue, Jul 19, 2016 at 3:53 PM, Ecem sogancıoglu  wrote:

> Hello All,
>
> there seems to be a performance issue with the covariance function in
> numpy 1.9 and later.
>
> Code example:
> *np.cov(np.random.randn(700,37000))*
>
> In numpy 1.8, this line of code requires 4.5755 seconds.
> In numpy 1.9 and later, the same line of code requires more than 30.3709 s
> execution time.
>

Hi Ecem, can you make sure to use the exact same random array as input to
np.cov when testing this? Also timing just the function call you're
interested in would be good; the creation of your 2-D array takes longer
than the np.cov call:

In [5]: np.random.seed(1234)

In [6]: x = np.random.randn(700,37000)

In [7]: %timeit np.cov(x)
1 loops, best of 3: 572 ms per loop

In [8]: %timeit np.random.randn(700, 37000)
1 loops, best of 3: 1.26 s per loop


Cheers,
Ralf



> Has anyone else observed this problem and is there a known bugfix?
>
>


Re: [Numpy-discussion] Custom Dtype/Units discussion

2016-07-12 Thread Ralf Gommers
On Tue, Jul 12, 2016 at 7:56 AM, Travis Oliphant 
wrote:

>
>
>
> http://www.continuum.io
> On Mon, Jul 11, 2016 at 12:58 PM, Charles R Harris <
> charlesr.har...@gmail.com> wrote:
>
>>
>>
>> On Mon, Jul 11, 2016 at 11:39 AM, Chris Barker 
>> wrote:
>>
>>>
>>>
>>> On Sun, Jul 10, 2016 at 8:12 PM, Nathan Goldbaum 
>>> wrote:
>>>

 Maybe this can be an informal BOF session?

>>>
>>> or  maybe a formal BoF? after all, how formal do they get?
>>>
>>> Anyway, it was my understanding that we really needed to do some
>>> significant refactoring of how numpy deals with dtypes in order to do this
>>> kind of thing cleanly -- so where has that gone since last year?
>>>
>>> Maybe this conversation should be about how to build a more flexible
>>> dtype system generally, rather than specifically about unit support.
>>> (though unit support is a great use-case to focus on)
>>>
>>
>> Note that Mark Wiebe will also be giving a talk Friday, so he may be
>> around. As the last person to add a type to Numpy and the designer of DyND
>> he might have some useful input. DyND development is pretty active and I'm
>> always curious how we can somehow move in that direction.
>>
>>
> There has been a lot of work over the past 6 months on making DyND
> implement the "pluribus" concept that I have talked about briefly in the
> past.   DyND now has a separate C++ ndt data-type library.  The Python
> interface to that type library is still unified in the dynd module but it
> is separable and work is in progress to make a separate Python-wrapper to
> this type library.The dynd type library is datashape described at
> http://datashape.pydata.org
>
> This type system is extensible and could be the foundation of a
> re-factored NumPy.  My view (and what I am encouraging work in the
> direction of) is that array computing in Python should be refactored into a
> "type-subsystem"  (I think ndt is the right model there), a generic
> ufunc-system (I think dynd has a very promising approach there as well),
> and then a container (the memoryview already in Python might be enough
> already).  These modules could be separately installed, maintained and
> eventually moved into Python itself.
>
> Then, a potential future NumPy project could be ported to be a layer of
> calculations and connections to other C-libraries on-top of this system.
> Many parts of the current code could be re-used in that effort --- or the
> new system could be part of a re-factoring of NumPy to make the innards of
> NumPy more accessible to a JIT compiler.
>
> We are already far enough along that this could be pursued with a
> motivated person.   It would take 18 months to complete the system but
> first-light would be less than 6 months for a dedicated, motivated, and
> talented resource.   DyND is far enough along as well as Cython and/or
> Numba to make this pretty straight-forward.For this re-factored
> array-computing project to take the NumPy name, this community would have
> to decide that that is the right thing to do. But, other projects like
> Pandas and/or xarray and/or numpy-py and/or NumPy on Jython could use this
> sub-system also.
>
> It has taken me a long time to actually get to the point where I would
> recommend a specific way forward.   I have thought about this for many
> years and don't make these recommendations lightly.The pluribus concept
> is my recommendation about what would be best now and in the future --- and
> I will be pursuing this concept and working to get to a point where this
> community will accept it if possible because it would be ideal if this new
> array library were still called NumPy.
>
> My working view is that someone will have to build the new prototype NumPy
> for the community to evaluate whether it's the right approach and get
> consensus that it is the right way forward.There is enough there now
> with DyND, data-shape, and Numba/Cython to do this fairly quickly. It
> is not strictly necessary to use DyND or Numba or even data-shape to
> accomplish this general plan --- but these are already available and a
> great place to start as they have been built explicitly with the intention
> of improving array-computing in Python.
>
> This potential NumPy could be backwards compatible from an API perspective
> (including a C-API) --- though recompilation would be necessary and there
> would be some semantic differences in corner-cases that could either be
> fixed where necessary but potentially just made part of the new version.
>
> I will be at the Continuum Happy hour on Thursday at our offices and
> welcome anyone to come discuss things with me there --- I am also willing
> to meet with anyone on Thursday and Friday if I can --- but I don't have a
> ticket to SciPy itself. Please CC me directly if you have questions.   I
> try to follow the numpy-discussion mailing list but I am not always
> successful at keeping 

Re: [Numpy-discussion] Remove FCompiler flags?

2016-07-07 Thread Ralf Gommers
On Thu, Jul 7, 2016 at 12:43 AM, Rob Nagler  wrote:

> I would like to remove "-fno-second-underscore" and "-Wall" from the
> fcompiler. I had hacked a build_clib subclass to remove the flags
> dynamically, but that is fragile (and is now broken). Is there a right way
> to remove flags?
> 
>

No good way to do that I'm afraid, every method is pretty fragile. If it's
on my own machine, I would just remove the flags from fcompiler/gnu.py by
hand. If it needs to be dynamically, then I'd probably go for
monkeypatching the GnuFCompiler/Gnu95FCompiler classes from that package
instead of a command subclass.

Ralf


Re: [Numpy-discussion] Added atleast_nd, request for clarification/cleanup of atleast_3d

2016-07-06 Thread Ralf Gommers
On Wed, Jul 6, 2016 at 6:26 PM, Nathaniel Smith <n...@pobox.com> wrote:

On Jul 5, 2016 11:21 PM, "Ralf Gommers" <ralf.gomm...@gmail.com> wrote:
> >
> >
> >
> > On Wed, Jul 6, 2016 at 7:06 AM, Nathaniel Smith <n...@pobox.com> wrote:
> >
> >> On Jul 5, 2016 9:09 PM, "Joseph Fox-Rabinovitz" <
> jfoxrabinov...@gmail.com> wrote:
> >> >
> >> > Hi,
> >> >
> >> > I have generalized np.atleast_1d, np.atleast_2d, np.atleast_3d with a
> >> > function np.atleast_nd in PR#7804
> >> > (https://github.com/numpy/numpy/pull/7804).
> >> >
> >> > As a result of this PR, I have a couple of questions about
> >> > `np.atleast_3d`. `np.atleast_3d` appears to do something weird with
> >> > the dimensions: If the input is 1D, it prepends and appends a size-1
> >> > dimension. If the input is 2D, it appends a size-1 dimension. This is
> >> > inconsistent with `np.atleast_2d`, which always prepends (as does
> >> > `np.atleast_nd`).
> >> >
> >> >   - Is there any reason for this behavior?
> >> >   - Can it be cleaned up (e.g., by reimplementing `np.atleast_3d` in
> >> > terms of `np.atleast_nd`, which is actually much simpler)? This would
> >> > be a slight API change since the output would not be exactly the same.
> >>
> >> Changing atleast_3d seems likely to break a bunch of stuff...
> >>
> >> Beyond that, I find it hard to have an opinion about the best design
> for these functions, because I don't think I've ever encountered a
> situation where they were actually what I wanted. I'm not a big fan of
> coercing dimensions in the first place, for the usual "refuse to guess"
> reasons. And then generally if I do want to coerce an array to another
> dimension, then I have some opinion about where the new dimensions should
> go, and/or I have some opinion about the minimum acceptable starting
> dimension, and/or I have a maximum dimension in mind. (E.g. "coerce 1d
> inputs into a column matrix; 0d or 3d inputs are an error" -- atleast_2d is
> zero-for-three on that requirements list.)
> >>
> >> I don't know how typical I am in this. But it does make me wonder if
> the atleast_* functions act as an attractive nuisance, where new users take
> their presence as an implicit recommendation that they are actually a
> useful thing to reach for, even though they... aren't that. And maybe we
> should be recommending folk move away from them rather than trying to
> extend them further?
> >>
> >> Or maybe they're totally useful and I'm just missing it. What's your
> use case that motivates atleast_nd?
> >
> > I think you're just missing it:) atleast_1d/2d are used quite a bit in
> Scipy and Statsmodels (those are the only ones I checked), and in the large
> majority of cases it's the best thing to use there. There's a bunch of
> atleast_2d calls with a transpose appended because the input needs to be
> treated as columns instead of rows, but that's still efficient and readable
> enough.
>
> I know people *use* it :-). What I'm confused about is in what situations
> you would invent it if it didn't exist. Can you point me to an example or
> two where it's "the best thing"? I actually had statsmodels in mind with my
> example of wanting the semantics "coerce 1d inputs into a column matrix; 0d
> or 3d inputs are an error". I'm surprised if there are places where you
> really want 0d arrays converted into 1x1,
>
Scalar to shape (1,1) is less common, but 1-D to 2-D or scalar to shape
(1,) is very common. Example is at the top of scipy/stats/stats.py: the
_chk_asarray functions (used in many other functions) take care to never
return scalar arrays because those are plain annoying to deal with. If that
sounds weird to you, you're probably one of those people who was never
surprised by this:

In [3]: x0 = np.array(1)

In [4]: x1 = np.array([1])

In [5]: x0[0]
---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
<ipython-input-5-...> in <module>()
----> 1 x0[0]

IndexError: too many indices for array

In [6]: x1[0]
Out[6]: 1
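A short sketch of how atleast_1d sidesteps that surprise (0-d arrays cannot be indexed, 1-d results can), which is the pattern helpers like _chk_asarray rely on:

```python
import numpy as np

x0 = np.array(1)    # 0-d array: x0[0] raises IndexError
x1 = np.array([1])  # 1-d array: x1[0] works

# np.atleast_1d coerces both inputs to a consistent >= 1-D shape,
# so downstream code never has to special-case 0-d arrays
y0, y1 = np.atleast_1d(x0, x1)
assert y0.shape == (1,) and y1.shape == (1,)
assert y0[0] == 1
```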

> or want to allow high dimensional arrays to pass through - and if you do
> want to allow high dimensional arrays to pass through, then transposing
> might help with 2d cases but will silently mangle high-d cases, right?
>
2d input handling is usually irrelevant. The vast majority of cases is
"function that accepts scalar and 1-D array" or "function that accepts 1-D
and 2-D arrays".

Ralf


Re: [Numpy-discussion] Added atleast_nd, request for clarification/cleanup of atleast_3d

2016-07-06 Thread Ralf Gommers
On Wed, Jul 6, 2016 at 7:06 AM, Nathaniel Smith  wrote:

On Jul 5, 2016 9:09 PM, "Joseph Fox-Rabinovitz" 
> wrote:
> >
> > Hi,
> >
> > I have generalized np.atleast_1d, np.atleast_2d, np.atleast_3d with a
> > function np.atleast_nd in PR#7804
> > (https://github.com/numpy/numpy/pull/7804).
> >
> > As a result of this PR, I have a couple of questions about
> > `np.atleast_3d`. `np.atleast_3d` appears to do something weird with
> > the dimensions: If the input is 1D, it prepends and appends a size-1
> > dimension. If the input is 2D, it appends a size-1 dimension. This is
> > inconsistent with `np.atleast_2d`, which always prepends (as does
> > `np.atleast_nd`).
> >
> >   - Is there any reason for this behavior?
> >   - Can it be cleaned up (e.g., by reimplementing `np.atleast_3d` in
> > terms of `np.atleast_nd`, which is actually much simpler)? This would
> > be a slight API change since the output would not be exactly the same.
>
> Changing atleast_3d seems likely to break a bunch of stuff...
>
> Beyond that, I find it hard to have an opinion about the best design for
> these functions, because I don't think I've ever encountered a situation
> where they were actually what I wanted. I'm not a big fan of coercing
> dimensions in the first place, for the usual "refuse to guess" reasons. And
> then generally if I do want to coerce an array to another dimension, then I
> have some opinion about where the new dimensions should go, and/or I have
> some opinion about the minimum acceptable starting dimension, and/or I have
> a maximum dimension in mind. (E.g. "coerce 1d inputs into a column matrix;
> 0d or 3d inputs are an error" -- atleast_2d is zero-for-three on that
> requirements list.)
>
> I don't know how typical I am in this. But it does make me wonder if the
> atleast_* functions act as an attractive nuisance, where new users take
> their presence as an implicit recommendation that they are actually a
> useful thing to reach for, even though they... aren't that. And maybe we
> should be recommending folk move away from them rather than trying to
> extend them further?
>
> Or maybe they're totally useful and I'm just missing it. What's your use
> case that motivates atleast_nd?
>
I think you're just missing it:) atleast_1d/2d are used quite a bit in
Scipy and Statsmodels (those are the only ones I checked), and in the large
majority of cases it's the best thing to use there. There's a bunch of
atleast_2d calls with a transpose appended because the input needs to be
treated as columns instead of rows, but that's still efficient and readable
enough.

For 3D/nD I can see that you'd need more control over where the dimensions
go, but 1D/2D are fine.

Ralf
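The inconsistent dimension placement raised earlier in the thread is easy to see directly (a quick illustration):

```python
import numpy as np

a1 = np.ones(4)
a2 = np.ones((4, 5))

# atleast_2d always prepends the new axis...
assert np.atleast_2d(a1).shape == (1, 4)
# ...while atleast_3d pads 1-D input on both sides, and 2-D input at the end
assert np.atleast_3d(a1).shape == (1, 4, 1)
assert np.atleast_3d(a2).shape == (4, 5, 1)
```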


Re: [Numpy-discussion] f2py output module name

2016-07-05 Thread Ralf Gommers
On Tue, Jul 5, 2016 at 9:18 PM, klo uo  wrote:

> Hi, I'm following this guide:
> http://docs.scipy.org/doc/numpy-dev/f2py/getting-started.html#the-quick-and-smart-way
>
> I'm on Windows with gfortran and VS2015. When I run:
>
> f2py -c -m fib3 fib3.f
>
> as output I don't get "fib3.pyd", but "fib3.cp35-win_amd64.pyd".
>
> Does anyone know how to get correctly named module in this case?
>

That is the right output, see https://www.python.org/dev/peps/pep-3149. You
can check the expected extension tag with
`sysconfig.get_config_var('SOABI')`.

There may be a way to turn that off, but why would you?
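The tag can be inspected from Python directly. A small sketch (the example values in the comments are illustrative; the actual output depends on interpreter and platform):

```python
import sysconfig

# The extension-module suffix encodes the interpreter ABI (PEP 3149);
# f2py builds its output filename from it
print(sysconfig.get_config_var('SOABI'))       # e.g. 'cp35-win_amd64'
print(sysconfig.get_config_var('EXT_SUFFIX'))  # e.g. '.cp35-win_amd64.pyd'
```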

Ralf


Re: [Numpy-discussion] Accelerate or OpenBLAS for numpy / scipy wheels?

2016-07-01 Thread Ralf Gommers
On Wed, Jun 29, 2016 at 11:06 PM, Sturla Molden <sturla.mol...@gmail.com>
wrote:

> Ralf Gommers <ralf.gomm...@gmail.com> wrote:
>
> > For most routines performance seems to be comparable, and both are much
> > better than ATLAS. When there's a significant difference, I have the
> > impression that OpenBLAS is more often the slower one (example:
> https://github.com/xianyi/OpenBLAS/issues/533).
>
> Accelerate is in general better optimized for level-1 and level-2 BLAS than
> OpenBLAS. There are two reasons for this:
>
> First, OpenBLAS does not use AVX for these kernels, but Accelerate does.
> This is the more important difference. It seems the OpenBLAS devs are now
> working on this.
>
> Second, the thread pool in OpenBLAS is not as scalable on small tasks as
> the "Grand Central Dispatch" (GCD) used by Accelerate. The GCD thread-pool
> used by Accelerate is actually quite unique in having a very tiny overhead:
> It takes only 16 extra opcodes (IIRC) for running a task on the global
> parallel queue instead of the current thread. (Even if my memory is not
> perfect and it is not exactly 16 opcodes, it is within that order of
> magnitude.) GCD can do this because the global queues and threadpool is
> actually built into the kernel of the OS. On the other hand, OpenBLAS and
> MKL depends on thread pools managed in userspace, for which the scheduler
> in the OS have no special knowledge. When you need fine-grained parallelism
> and synchronization, there is nothing like GCD. Even a user-space spinlock
> will have bigger overhead than a sequential queue in GCD. With a userspace
> threadpool all threads are scheduled on a round robin basis, but with GCD
> the scheduler has special knowledge about the tasks put on the queues, and
> executes them as fast as possible. Accelerate therefore has an unique
> advantage when running level-1 and 2 BLAS routines, with which OpenBLAS or
> MKL probably never can properly compete. Programming with GCD can actually
> often be counter-intuitive to someone used to deal with OpenMP, MPI or
> pthreads. For example it is often better to enqueue a lot of small tasks
> instead of splitting up the computation into large chunks of work. When
> parallelising a tight loop, a chunk size of 1 can be great on GCD but is
> likely to be horrible on OpenMP and anything else that has userspace
> threads.
>

Thanks Sturla, interesting details as always. You didn't state your
preference by the way, do you have one?

We're building binaries for the average user, so I'd say the AVX thing is
of relevance for the decision to be made, the GCD one less so (people who
care about that will not have any trouble building their own numpy).

So far the score is: one +1, one +0.5, one +0, one -1 and one "still a bit
nervous". Any other takers?

Ralf


Re: [Numpy-discussion] Is numpy.test() supposed to be multithreaded?

2016-06-29 Thread Ralf Gommers
On Wed, Jun 29, 2016 at 3:27 AM, Chris Barker - NOAA Federal <
chris.bar...@noaa.gov> wrote:

>
>
>> Now the user is writing back to say, "my test code is fast now, but
>> numpy.test() is still about three times slower than <the server I
>> don't manage>".  When I watch htop as numpy.test() executes, sure enough,
>> it's using one core
>>
>
> * if numpy.test() is supposed to be using multiple cores, why isn't it,
>> when we've established with other test code that it's now using multiple
>> cores?
>>
>
> Some numpy.linalg functions (like np.dot) will be using multiple cores,
> but np.linalg.test() takes only ~1% of the time of the full test suite.
> Everything else will be running single core. So your observations are not
> surprising.
>
>
> Though why it would run slower on one box than another comparable box is a
> mystery...
>

Maybe just hardware config? I see a similar difference between how long the
test suite runs on TravisCI vs my linux desktop (the latter is slower,
surprisingly).

Ralf


Re: [Numpy-discussion] Is numpy.test() supposed to be multithreaded?

2016-06-28 Thread Ralf Gommers
On Tue, Jun 28, 2016 at 10:36 PM, Michael Ward  wrote:

> Heya, I'm not a numbers guy, but I maintain servers for scientists and
> researchers who are.  Someone pointed out that our numpy installation on a
> particular server was only using one core.  I'm unaware of the who/how the
> previous version of numpy/OpenBLAS were installed, so I installed them from
> scratch, and confirmed that the users test code now runs on multiple cores
> as expected, drastically increasing performance time.
>
> Now the user is writing back to say, "my test code is fast now, but
> numpy.test() is still about three times slower than <the server I
> don't manage>".  When I watch htop as numpy.test() executes, sure enough,
> it's using one core.  Now I'm not sure if that's the expected behavior or
> not.  Questions:
>
> * if numpy.test() is supposed to be using multiple cores, why isn't it,
> when we've established with other test code that it's now using multiple
> cores?
>

Some numpy.linalg functions (like np.dot) will be using multiple cores, but
np.linalg.test() takes only ~1% of the time of the full test suite.
Everything else will be running single core. So your observations are not
surprising.
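One rough way to confirm this yourself: check which BLAS numpy is linked against, then exercise it with a large matrix product while watching htop (a sketch; the multithreading lives in the BLAS, not in numpy's own test code):

```python
import numpy as np

np.__config__.show()  # prints which BLAS/LAPACK numpy was built against

# A big matrix product goes through BLAS, which is where extra cores help;
# almost everything else numpy.test() does runs on a single core.
a = np.random.rand(2000, 2000)
b = np.dot(a, a)
assert b.shape == (2000, 2000)
```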

Cheers,
Ralf


Re: [Numpy-discussion] Accelerate or OpenBLAS for numpy / scipy wheels?

2016-06-28 Thread Ralf Gommers
On Tue, Jun 28, 2016 at 5:50 PM, Chris Barker  wrote:

>
>
>
>
> > This doesn't really matter too much imho, we have to support Accelerate
>> > either way.
>>
>
> do we? -- so if we go OpenBlas, and someone want to do a simple build from
> source, what happens? Do they get accelerate?
>

Indeed, unless they go through the effort of downloading a separate BLAS
and LAPACK, and figuring out how to make that visible to numpy.distutils.
Very few users will do that.
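For reference, pointing numpy.distutils at a self-built OpenBLAS is typically done with a site.cfg placed next to numpy's setup.py. A sketch, with assumed install paths:

```ini
# site.cfg -- the paths below are assumptions for illustration
[openblas]
libraries = openblas
library_dirs = /opt/OpenBLAS/lib
include_dirs = /opt/OpenBLAS/include
runtime_library_dirs = /opt/OpenBLAS/lib
```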


> or would we ship OpenBlas source itself?
>

Definitely don't want to do that.


> or would they need to install OpenBlas some other way?
>

Yes, or MKL, or ATLAS, or BLIS. We have support for all these, and that's a
good thing. Making a uniform choice for our official binaries on various
OSes doesn't reduce the need or effort for supporting those other options.


>
> >> >> Faster to fix bugs with good support from main developer.  No
>> >> >> multiprocessing crashes for Python 2.7.
>>
>
> this seems to be the compelling one.
>
> How does the performance compare?
>

For most routines performance seems to be comparable, and both are much
better than ATLAS. When there's a significant difference, I have the
impression that OpenBLAS is more often the slower one (example:
https://github.com/xianyi/OpenBLAS/issues/533).

Ralf


Re: [Numpy-discussion] Accelerate or OpenBLAS for numpy / scipy wheels?

2016-06-28 Thread Ralf Gommers
On Tue, Jun 28, 2016 at 2:55 PM, Matthew Brett 
wrote:

> Hi,
>
> On Tue, Jun 28, 2016 at 5:25 AM, Charles R Harris
>  wrote:
> >
> >
> > On Mon, Jun 27, 2016 at 9:46 PM, Matthew Brett 
> > wrote:
> >>
> >> Hi,
> >>
> >> I just succeeded in getting an automated dual arch build of numpy and
> >> scipy, using OpenBLAS.  See the last three build jobs in these two
> >> build matrices:
> >>
> >> https://travis-ci.org/matthew-brett/numpy-wheels/builds/140388119
> >> https://travis-ci.org/matthew-brett/scipy-wheels/builds/140684673
> >>
> >> Tests are passing on 32 and 64-bit.
> >>
> >> I didn't upload these to the usual Rackspace container at
> >> wheels.scipy.org to avoid confusion.
> >>
> >> So, I guess the question now is - should we switch to shipping
> >> OpenBLAS wheels for the next release of numpy and scipy?  Or should we
> >> stick with the Accelerate framework that comes with OSX?
> >>
> >> In favor of the Accelerate build : faster to build, it's what we've
> >> been doing thus far.
>

Faster to build isn't really an argument right? Should be the same build
time except for building OpenBLAS itself once per OpenBLAS version. And
only applies to building wheels for releases - nothing changes for source
builds done by users on OS X. If build time ever becomes a real issue, then
dropping the dual arch stuff is probably the way to go - the 32-bit builds
make very little sense these days.

What we've been doing thus far - that is the more important argument.
There's a risk in switching, we may encounter new bugs or lose some
performance in particular functions.


> >>
> >> In favor of OpenBLAS build : allows us to commit to one BLAS / LAPACK
> >> library cross platform,
>

This doesn't really matter too much imho, we have to support Accelerate
either way.


> when we have the Windows builds working.
> >> Faster to fix bugs with good support from main developer.  No
> >> multiprocessing crashes for Python 2.7.
>

This is probably the main reason to make the switch, if we decide to do
that.


>   I'm still a bit nervous about OpenBLAS, see
> > https://github.com/scipy/scipy/issues/6286. That was with version
> 0.2.18,
> > which is pretty recent.
>
> Well - we are committed to OpenBLAS already for the Linux wheels, so
> if that failure was due to an error in OpenBLAS, we'll have to report
> it and get it fixed / fix it ourselves upstream.
>

Indeed. And those wheels have been downloaded a lot already, without any
issues being reported.

I'm +0 on the proposal - the risk seems acceptable, but the reasons to make
the switch are also not super compelling.

Ralf


Re: [Numpy-discussion] Pip download stats for numpy

2016-06-26 Thread Ralf Gommers
On Sat, Jun 25, 2016 at 12:25 AM, Matthew Brett 
wrote:

> Hi,
>
> I just ran a query on pypi downloads [1] using the BigQuery interface
> to pypi stats [2].  It lists the numpy files downloaded from pypi via
> a pip install, over the last two weeks, ordered by the number of
> downloads:
>
> 1  100595 numpy-1.11.0.tar.gz
> 2  97754 numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl
> 3  38471 numpy-1.8.1-cp27-cp27mu-manylinux1_x86_64.whl
> 4  20874 numpy-1.11.0-cp27-none-win_amd64.whl
> 5  20049
> numpy-1.11.0-cp27-cp27m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
> 6  17100 numpy-1.10.4-cp27-cp27mu-manylinux1_x86_64.whl
> 7  15187 numpy-1.10.1.zip
> 8  14277 numpy-1.11.0-cp35-cp35m-manylinux1_x86_64.whl
> 9  11538 numpy-1.9.1.tar.gz
> 10 11272 numpy-1.11.0-cp27-none-win32.whl
>

Thanks Matthew, interesting.


> Of course, it's difficult to know how many of these are from automated
> builds, such as from travis-ci, but it does look as if manylinux
> wheels are getting some traction.
>

Looks like the vast majority is from CI setups, but that's still a lot of
time not spent building numpy from source so also a good thing.

Ralf


Re: [Numpy-discussion] Deprecating silent truncation of floats when assigned to int array

2016-06-17 Thread Ralf Gommers
On Tue, Jun 14, 2016 at 12:23 AM, Ian Henriksen <
insertinterestingnameh...@gmail.com> wrote:

> Personally, I think this is a great idea. +1 to more informative errors.
>

+1 from me as well

Ralf



> Best,
> Ian Henriksen
>
> On Mon, Jun 13, 2016 at 2:11 PM Nathaniel Smith  wrote:
>
>> It was recently pointed out:
>>
>>   https://github.com/numpy/numpy/issues/7730
>>
>> that this code silently truncates floats:
>>
>> In [1]: a = np.arange(10)
>>
>> In [2]: a.dtype
>> Out[2]: dtype('int64')
>>
>> In [3]: a[3] = 1.5
>>
>> In [4]: a[3]
>> Out[4]: 1
>>
>> The proposal is that we should deprecate this, and eventually turn it
>> into an error. Any objections?
>>
>> We recently went through a similar deprecation cycle for in-place
>> operations, i.e., this used to silently truncate but now raises an
>> error:
>>
>> In [1]: a = np.arange(10)
>>
>> In [2]: a += 1.5
>>
>> ---------------------------------------------------------------------------
>> TypeError                                Traceback (most recent call last)
>> <ipython-input-2-...> in <module>()
>> ----> 1 a += 1.5
>>
>> TypeError: Cannot cast ufunc add output from dtype('float64') to
>> dtype('int64') with casting rule 'same_kind'
>>
>> so the proposal here is to extend this to regular assignment.
>>
>> -n
>>
>> --
>> Nathaniel J. Smith -- https://vorpus.org
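Under the proposal, code that relies on the truncation would have to ask for it explicitly. A sketch of spellings that would stay valid:

```python
import numpy as np

a = np.arange(10)

# Instead of the silent a[3] = 1.5, truncate explicitly:
a[3] = int(1.5)
assert a[3] == 1

# In-place ops can request the lossy cast explicitly:
np.add(a, 1.5, out=a, casting='unsafe')
assert a.dtype.kind == 'i'  # still an integer array
```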


Re: [Numpy-discussion] scipy.stats.qqplot and scipy.stats.probplot axis labeling

2016-06-12 Thread Ralf Gommers
On Sat, Jun 11, 2016 at 2:53 PM, Ralf Gommers <ralf.gomm...@gmail.com>
wrote:

> Hi Mark,
>
> Note that the scipy-dev or scipy-user mailing list would have been more
> appropriate for this question.
>
>
> On Fri, Jun 10, 2016 at 9:06 AM, Mark Gawron <gaw...@mail.sdsu.edu> wrote:
>
>>
>>
>> The scipy.stats.qqplot and scipy.stats.probplot  functions plot expected
>> values versus actual data values for visualization of fit to a
>> distribution.  First a one-D array of expected percentiles is generated for
>>  a sample of size N; then that is passed to  dist.ppf, the per cent point
>> function for the chosen distribution, to return an array of expected
>> values.  The visualized data points are pairs of expected and actual
>> values, and a linear regression is done on these to produce the line data
>> points in this distribution should lie on.
>>
>> Where x is the input data array and dist the chosen distribution we have:
>>
>> osr = np.sort(x)
>> osm_uniform = _calc_uniform_order_statistic_medians(len(x))
>> osm = dist.ppf(osm_uniform)
>> slope, intercept, r, prob, sterrest = stats.linregress(osm, osr)
>>
>>
>> My question concerns the plot display.
>>
>> ax.plot(osm, osr, 'bo', osm, slope*osm + intercept, 'r-')
>>
>>
>> The x-axis of the resulting plot is labeled quantiles, but the xticks and
>> xticklabels produced by qqplot and probplot do not seem correct
>> for their intended interpretations.  First the numbers on the x-axis do
>> not represent quantiles; the intervals between them do not in general
>> contain equal numbers of points.  For a normal distribution with sigma=1,
>> they represent standard deviations.  Changing the label on the x-axis does
>> not seem like a very good solution, because the interpretation of the
>> values on the x-axis will be different for different distributions.  Rather
>> the right solution seems to be to actually show quantiles on the x-axis.
>> The numbers on the x-axis can stay as they are, representing quantile
>> indexes, but they need to be spaced so as to show the actual division
>> points that carve the population up into  groups of the same size.  This
>> can be done in something like the following way.
>>
>
> The ticks are correct I think, but they're theoretical quantiles and not
> sample quantiles. This was discussed in [1] and is consistent with R [2]
> and statsmodels [3]. I see that we just forgot to add "theoretical" to the
> x-axis label (mea culpa). Does adding that resolve your concern?
>

Sent a PR for this: https://github.com/scipy/scipy/pull/6249

Ralf



>
> [1] https://github.com/scipy/scipy/issues/1821
> [2] http://data.library.virginia.edu/understanding-q-q-plots/
> [3]
> http://statsmodels.sourceforge.net/devel/generated/statsmodels.graphics.gofplots.qqplot.html?highlight=qqplot#statsmodels.graphics.gofplots.qqplot
>
> Ralf
>
>
>
>


Re: [Numpy-discussion] NumPy lesson at EuroScipy2016?

2016-06-11 Thread Ralf Gommers
On Thu, Jun 9, 2016 at 11:25 PM,  wrote:

> Hi all,
>
> Recently I taught "Advanced NumPy" lesson at a Software Carpentry workshop
> [1]. It covered a review of basic operations on numpy arrays and also more
> advanced topics: indexing, broadcasting, dtypes and memory layout. I would
> greatly appreciate your feedback on the lesson materials, which are
> available on github pages [2].
>
> I am also thinking of proposing this lesson as a EuroScipy 2016 tutorial.
> Is anyone already planning to teach NumPy there? If so, would you be
> interested to team up for this lesson (as a co-instructor, helper or
> mentor)?
>

There's always a Numpy tutorial at EuroScipy. Emmanuelle (Cc'd) is the
tutorial chair, she can tell you the plan and I'm sure she appreciates your
offer of help.

Cheers,
Ralf


Re: [Numpy-discussion] scipy.stats.qqplot and scipy.stats.probplot axis labeling

2016-06-11 Thread Ralf Gommers
Hi Mark,

Note that the scipy-dev or scipy-user mailing list would have been more
appropriate for this question.


On Fri, Jun 10, 2016 at 9:06 AM, Mark Gawron  wrote:

>
>
> The scipy.stats.qqplot and scipy.stats.probplot  functions plot expected
> values versus actual data values for visualization of fit to a
> distribution.  First a one-D array of expected percentiles is generated for
>  a sample of size N; then that is passed to  dist.ppf, the per cent point
> function for the chosen distribution, to return an array of expected
> values.  The visualized data points are pairs of expected and actual
> values, and a linear regression is done on these to produce the line data
> points in this distribution should lie on.
>
> Where x is the input data array and dist the chosen distribution we have:
>
> osr = np.sort(x)
> osm_uniform = _calc_uniform_order_statistic_medians(len(x))
> osm = dist.ppf(osm_uniform)
> slope, intercept, r, prob, sterrest = stats.linregress(osm, osr)
>
>
> My question concerns the plot display.
>
> ax.plot(osm, osr, 'bo', osm, slope*osm + intercept, 'r-')
>
>
> The x-axis of the resulting plot is labeled quantiles, but the xticks and
> xticklabels produced by qqplot and probplot do not seem correct
> for their intended interpretations.  First the numbers on the x-axis do
> not represent quantiles; the intervals between them do not in general
> contain equal numbers of points.  For a normal distribution with sigma=1,
> they represent standard deviations.  Changing the label on the x-axis does
> not seem like a very good solution, because the interpretation of the
> values on the x-axis will be different for different distributions.  Rather
> the right solution seems to be to actually show quantiles on the x-axis.
> The numbers on the x-axis can stay as they are, representing quantile
> indexes, but they need to be spaced so as to show the actual division
> points that carve the population up into  groups of the same size.  This
> can be done in something like the following way.
>

The ticks are correct I think, but they're theoretical quantiles and not
sample quantiles. This was discussed in [1] and is consistent with R [2]
and statsmodels [3]. I see that we just forgot to add "theoretical" to the
x-axis label (mea culpa). Does adding that resolve your concern?

[1] https://github.com/scipy/scipy/issues/1821
[2] http://data.library.virginia.edu/understanding-q-q-plots/
[3]
http://statsmodels.sourceforge.net/devel/generated/statsmodels.graphics.gofplots.qqplot.html?highlight=qqplot#statsmodels.graphics.gofplots.qqplot
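For concreteness, the "theoretical quantiles" on the x-axis are the distribution's ppf evaluated at the uniform order-statistic medians (Filliben's estimate). A minimal sketch using only the standard library, with the normal distribution assumed as the target (scipy's internal helper may differ in detail):

```python
from statistics import NormalDist

def theoretical_quantiles(n):
    """Normal theoretical quantiles at Filliben's order-statistic medians."""
    # Filliben's estimate of the uniform order-statistic medians
    v = [0.0] * n
    v[0] = 1 - 0.5 ** (1.0 / n)
    v[-1] = 0.5 ** (1.0 / n)
    for i in range(1, n - 1):
        # 1-indexed formula: (i - 0.3175) / (n + 0.365) for i = 2..n-1
        v[i] = (i + 1 - 0.3175) / (n + 0.365)
    # Map through the inverse CDF (the ppf) of the target distribution
    return [NormalDist().inv_cdf(p) for p in v]

q = theoretical_quantiles(5)
# Symmetric about zero and strictly increasing, as a qqplot x-axis should be
```

These are the values the ticks refer to, which is why they don't delimit equal-count groups of the sample.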

Ralf


Re: [Numpy-discussion] NumPy 1.11 docs

2016-05-29 Thread Ralf Gommers
On Sun, May 29, 2016 at 4:35 AM, Stephan Hoyer  wrote:

> These still are missing from the SciPy.org page, several months after the
> release.
>

Thanks Stephan, that needs fixing.


>
> What do we need to do to keep these updated?
>

https://github.com/numpy/numpy/blob/master/doc/HOWTO_RELEASE.rst.txt#update-docsscipyorg


> Is there someone at Enthought we should ping? Or do we really just need to
> transition to different infrastructure?
>

No, we just need to not forget :). The release manager normally does this, or
he pings someone else to do it. At the moment Pauli, Julian, Evgeni and I
have access to the server. I'll fix it up today.

Ralf


Re: [Numpy-discussion] [SciPy-Dev] Numpy 1.11.1rc1 release

2016-05-27 Thread Ralf Gommers
On Fri, May 27, 2016 at 4:42 AM, Charles R Harris  wrote:

> Hi All,
>
> I am pleased to announce the release of Numpy 1.11.1rc1.
>

Thanks Chuck!


> The sources may be found on sourceforge
>  and
> wheels for OS X, Windows, and Linux will be available on pypi sometime in
> the next few days. The pypi release is delayed due to the decision that the
> wheels should go up before the sources in order that people not get a
> source install when what they want are wheels.
>

For pre-releases that delay is maybe not necessary, because the average
user won't see those. Only if you add --pre to your pip install command
will you get them.
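For anyone who does want to try a pre-release, the flag in question looks like this (the pinned version number is just an illustration):

```shell
# A regular install resolves to the latest stable release only
pip install numpy

# Opt in to pre-releases (alphas/betas/rcs) explicitly
pip install --pre numpy

# Or pin a release candidate directly
pip install numpy==1.11.1rc1
```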

Ralf


Re: [Numpy-discussion] axis parameter for count_nonzero

2016-05-22 Thread Ralf Gommers
On Sun, May 22, 2016 at 3:05 AM, G Young  wrote:

> Hi,
>
> I have had a PR  open (first
> draft can be found here ) for
> quite some time now that adds an 'axis' parameter to *count_nonzero*.
> While the functionality is fully in-place, very robust, and actually
> higher-performing than the original *count_nonzero* function, the
> obstacle at this point is the implementation, as most of the functionality
> is now surfaced at the Python level instead of at the C level.
>
> I have made several attempts to move the code into C to no avail and have
> not received much feedback from maintainers unfortunately to move this
> forward, so I'm opening this up to the mailing list to see what you guys
> think of the changes and whether or not it should be merged in as is or be
> tabled until a more C-friendly solution can be found.
>

The discussion is spread over several PRs/issues, so maybe a summary is
useful:

- adding an axis parameter was a feature request that was generally
approved of [1]
- writing the axis selection/validation code in C, like the rest of
count_nonzero, was preferred by several core devs
- Writing that C code turns out to be tricky. Jaime had a PR for doing this
for bincount [2], but closed it with final conclusion "the proper approach
seems to me to build some intermediate layer over nditer that abstracts the
complexity away".
- Julian pointed out that this adds a ufunc-like param, so why not add
other params like out/keepdims [3]
- Stephan points out that the current PR has quite a few branches, would
benefit from reusing a helper function (like _validate_axis, but that may
not do exactly the right thing), and that he doesn't want to merge it as is
without further input from other devs [4].

Points previously not raised that I can think of:
- count_nonzero is also in the C API [5]; the axis parameter is now only
added to the Python API.
- Part of why the code in this PR is complex is to keep performance for
small arrays OK, but there are no benchmarks added or results given for the
existing benchmark [6]. A simple check with:
  x = np.arange(100)
  %timeit np.count_nonzero(x)
shows that that gets about 30x slower (330 ns vs 10.5 us on my machine).
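For context, the semantics being discussed are equivalent to a boolean sum along the axis. This sketch (not the actual PR code) shows the pure-Python fallback that the proposed feature is competing against:

```python
import numpy as np

def count_nonzero_axis(a, axis=None):
    """Count nonzero entries along an axis via a boolean sum (sketch)."""
    a = np.asarray(a)
    if axis is None:
        # Fast C path for the existing no-axis case
        return np.count_nonzero(a)
    return (a != 0).sum(axis=axis, dtype=np.intp)

x = np.array([[0, 1, 2],
              [3, 0, 4]])
print(count_nonzero_axis(x, axis=0))  # [1 1 2]
print(count_nonzero_axis(x, axis=1))  # [2 2]
```

The boolean temporary is what makes this slower than a dedicated C loop for large inputs, which is why several core devs preferred a C implementation.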

It looks to me like performance is a concern, and if that can be resolved
there's the broader discussion of whether it's a good idea to merge this PR
at all. That's a trade-off of adding a useful feature vs. technical debt /
maintenance burden plus divergence Python/C API. Also, what do we do when
we merge this and then next week someone else sends a PR adding a keepdims
or out keyword? For these kinds of additions it would feel better if we
were sure that the new version is the final/desired one for the foreseeable
future.

Ralf


[1] https://github.com/numpy/numpy/issues/391
[2] https://github.com/numpy/numpy/pull/4330#issuecomment-77791250
[3] https://github.com/numpy/numpy/pull/7138#issuecomment-177202894
[4] https://github.com/numpy/numpy/pull/7177
[5]
http://docs.scipy.org/doc/numpy/reference/c-api.array.html#c.PyArray_CountNonzero
[6]
https://github.com/numpy/numpy/blob/master/benchmarks/benchmarks/bench_ufunc.py#L70


Re: [Numpy-discussion] linux wheels coming soon

2016-03-24 Thread Ralf Gommers
On Thu, Mar 24, 2016 at 4:04 PM, Peter Cock 
wrote:

> Hi Nathaniel,
>
> Will you be providing portable Linux wheels aka manylinux1?
> https://www.python.org/dev/peps/pep-0513/
>
> Does this also open up the door to releasing wheels for SciPy
> too?
>

That should work just fine.


> While speeding up "pip install" would be of benefit in itself,
> I am particularly keen to see this for use within automated
> testing frameworks like TravisCI where currently having to
> install NumPy (and SciPy) from source is an unreasonable
> overhead.
>

There's already http://travis-dev-wheels.scipy.org/ (latest dev versions of
numpy and scipy) and http://travis-wheels.scikit-image.org/ (releases,
there are multiple sources for this one) for TravisCI setups to reuse.

Ralf


> Many thanks to everyone working on this,
>
> Peter
>
> On Tue, Mar 15, 2016 at 11:33 PM, Nathaniel Smith  wrote:
> > Hi all,
> >
> > Just a heads-up that we're planning to upload Linux wheels for numpy
> > to PyPI soon. Unless there's some objection, these will be using
> > ATLAS, just like the current Windows wheels, for the same reasons --
> > moving to something faster like OpenBLAS would be good, but given the
> > concerns about OpenBLAS's reliability we want to get something working
> > first and then worry about making it fast. (Plus it doesn't make sense
> > to ship different BLAS libraries on Windows versus Linux -- that just
> > multiplies our support burden for no reason.)
> >
> > -n
> >
> > --
> > Nathaniel J. Smith -- https://vorpus.org
> > ___
> > NumPy-Discussion mailing list
> > NumPy-Discussion@scipy.org
> > https://mail.scipy.org/mailman/listinfo/numpy-discussion
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>


Re: [Numpy-discussion] PyData Madrid

2016-02-20 Thread Ralf Gommers
On Wed, Feb 17, 2016 at 9:46 PM, Sebastian Berg 
wrote:

> On Mi, 2016-02-17 at 20:59 +0100, Jaime Fernández del Río wrote:
> > Hi all,
> >
> > I just found out there is a PyData Madrid happening in early April,
> > and it would feel wrong not to go, it being my hometown and all.
> >
> > Aside from the usual "Who else is going? We should meet!" I was also
> > thinking of submitting a proposal for a talk.  My idea was to put
> > something together on "The future of NumPy indexing" and use it as an
> > opportunity to raise awareness and hopefully gather feedback from
> > users on the proposed changes, in sort of a "if the mountain won't
> > come to Muhammad" type of thing.
> >
>
> I guess you do know my last name means mountain in german? But if
> Muhammed might come, I should really improve my arabic ;).
>
> In any case sounds good to me if you like to do it, I don't think I
> will go, though it sounds nice.
>

Sounds like a good idea to me too. I like both the concrete topic, as well
as just having a talk on Numpy at a PyData conference. In general there are
too few (if any) talks on Numpy and other core libraries at PyData and
Scipy confs I think.

Ralf


Re: [Numpy-discussion] Numexpr-3.0 proposal

2016-02-14 Thread Ralf Gommers
On Sun, Feb 14, 2016 at 11:19 PM, Robert McLeod 
wrote:

>
> 4.) I took a stab at converting from distutils to setuputils but this
> seems challenging with numpy as a dependency. I wonder if anyone has tried
> monkey-patching so that setup.py build_ext uses distutils and then pass the
> interpreter.pyd/so as a data file, or some other such chicanery?
>

Not sure what you mean, since numexpr already uses setuptools:
https://github.com/pydata/numexpr/blob/master/setup.py#L22. What is the
real goal you're trying to achieve?

This monkeypatching is a bad idea:
https://github.com/robbmcleod/numexpr/blob/numexpr-3.0/setup.py#L19. Both
setuptools and numpy.distutils already do that, and that's already one too
many. So you definitely don't want to add a third place. You can use the
-j (--parallel) flag to numpy.distutils instead, see
http://docs.scipy.org/doc/numpy-dev/user/building.html#parallel-builds
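i.e. something along these lines (job count and install prefix are just examples):

```shell
# Parallel compilation with numpy.distutils; -j 4 runs 4 build jobs at once
python setup.py build -j 4 install --prefix $HOME/.local
```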

Ralf


Re: [Numpy-discussion] Suggestion: special-case np.array(range(...)) to be faster

2016-02-14 Thread Ralf Gommers
On Sun, Feb 14, 2016 at 10:36 PM, Charles R Harris <
charlesr.har...@gmail.com> wrote:

>
>
> On Sun, Feb 14, 2016 at 7:36 AM, Ralf Gommers <ralf.gomm...@gmail.com>
> wrote:
>
>>
>>
>> On Sun, Feb 14, 2016 at 9:21 AM, Antony Lee <antony@berkeley.edu>
>> wrote:
>>
>>> re: no reason why...
>>> This has nothing to do with Python2/Python3 (I personally stopped using
>>> Python2 at least 3 years ago.)  Let me put it this way instead: if
>>> Python3's "range" (or Python2's "xrange") was not a builtin type but a type
>>> provided by numpy, I don't think it would be controversial at all to
>>> provide an `__array__` special method to efficiently convert it to a
>>> ndarray.  It would be the same if `np.array` used a
>>> `functools.singledispatch` dispatcher rather than an `__array__` special
>>> method (which is obviously not possible for chronological reasons).
>>>
>>> re: iterable vs iterator: check for the presence of the __next__ special
>>> method (or isinstance(x, Iterable) vs. isinstance(x, Iterator) and not
>>> isinstance(x, Iterable))
>>>
>>
>> I think it's good to do something about this, but it's not clear what the
>> exact proposal is. I could image one or both of:
>>
>>   - special-case the range() object in array (and asarray/asanyarray?)
>> such that array(range(N)) becomes as fast as arange(N).
>>   - special-case all iterators, such that array(range(N)) becomes as fast
>> as deque(range(N))
>>
>
> I think the last wouldn't help much, as numpy would still need to
> determine dimensions and type.  I assume that is one of the reason sparse
> itself doesn't do that.
>

Not orders of magnitude, but this shows that there's something to optimize
for iterators:

In [1]: %timeit np.array(range(10))
100 loops, best of 3: 14.9 ms per loop

In [2]: %timeit np.array(list(range(10)))
100 loops, best of 3: 9.68 ms per loop

Ralf


Re: [Numpy-discussion] ANN: numpydoc 0.6.0 released

2016-02-14 Thread Ralf Gommers
On Sat, Feb 13, 2016 at 10:38 PM, <josef.p...@gmail.com> wrote:

>
>
> On Sat, Feb 13, 2016 at 10:03 AM, Ralf Gommers <ralf.gomm...@gmail.com>
> wrote:
>
>> Hi all,
>>
>> I'm pleased to announce the release of numpydoc 0.6.0. The main new
>> feature is support for the Yields section in numpy-style docstrings. This
>> is described in
>> https://github.com/numpy/numpy/blob/master/doc/HOWTO_DOCUMENT.rst.txt
>>
>> Numpydoc can be installed from PyPI:
>> https://pypi.python.org/pypi/numpydoc
>>
>
>
> Thanks,
>
> BTW: the status section in the howto still refers to the documentation
> editor, which has been retired AFAIK.
>

Thanks Josef. I sent a PR to remove that text.

Ralf


Re: [Numpy-discussion] Suggestion: special-case np.array(range(...)) to be faster

2016-02-14 Thread Ralf Gommers
On Sun, Feb 14, 2016 at 9:21 AM, Antony Lee  wrote:

> re: no reason why...
> This has nothing to do with Python2/Python3 (I personally stopped using
> Python2 at least 3 years ago.)  Let me put it this way instead: if
> Python3's "range" (or Python2's "xrange") was not a builtin type but a type
> provided by numpy, I don't think it would be controversial at all to
> provide an `__array__` special method to efficiently convert it to a
> ndarray.  It would be the same if `np.array` used a
> `functools.singledispatch` dispatcher rather than an `__array__` special
> method (which is obviously not possible for chronological reasons).
>
> re: iterable vs iterator: check for the presence of the __next__ special
> method (or isinstance(x, Iterable) vs. isinstance(x, Iterator) and not
> isinstance(x, Iterable))
>

I think it's good to do something about this, but it's not clear what the
exact proposal is. I could image one or both of:

  - special-case the range() object in array (and asarray/asanyarray?) such
that array(range(N)) becomes as fast as arange(N).
  - special-case all iterators, such that array(range(N)) becomes as fast
as deque(range(N))

or yet something else?
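A rough sketch of the first option, special-casing range objects ahead of the generic conversion path (the helper name and structure are illustrative, not the proposed implementation):

```python
import numpy as np

def asarray_fast(obj, dtype=None):
    """Convert to ndarray, taking a shortcut for range objects (sketch)."""
    if isinstance(obj, range):
        # arange fills the array in C, avoiding per-element Python
        # object creation; start/stop/step are attributes of range
        return np.arange(obj.start, obj.stop, obj.step, dtype=dtype)
    return np.asarray(obj, dtype=dtype)

a = asarray_fast(range(2, 20, 3))
```

The second option (all iterators) can't take this shortcut, since the length and dtype are unknown until the iterator is consumed.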

Ralf


[Numpy-discussion] ANN: numpydoc 0.6.0 released

2016-02-13 Thread Ralf Gommers
Hi all,

I'm pleased to announce the release of numpydoc 0.6.0. The main new feature
is support for the Yields section in numpy-style docstrings. This is
described in
https://github.com/numpy/numpy/blob/master/doc/HOWTO_DOCUMENT.rst.txt

Numpydoc can be installed from PyPI: https://pypi.python.org/pypi/numpydoc
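For generator functions, the new Yields section takes the place of Returns. A small example docstring (written here for illustration, following the HOWTO_DOCUMENT format):

```python
def pairwise(iterable):
    """Iterate over consecutive pairs of an iterable.

    Parameters
    ----------
    iterable : iterable
        Any iterable of items.

    Yields
    ------
    pair : tuple
        Consecutive ``(previous, current)`` items.
    """
    it = iter(iterable)
    prev = next(it)
    for item in it:
        yield (prev, item)
        prev = item
```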

Cheers,
Ralf


Re: [Numpy-discussion] GSoC?

2016-02-10 Thread Ralf Gommers
On Tue, Feb 9, 2016 at 1:02 AM, Chris Barker  wrote:

> As you can see in the timeline:
>
> https://developers.google.com/open-source/gsoc/timeline
>
> We are now in the stage where mentoring organizations are getting their
> act together. So the question now is -- are there folks that want to mentor
> for numpy projects? It can be rewarding, but it's a pretty big commitment
> as well, and, I suppose depending on the project, would require some good
> knowledge of the innards of numpy -- there are not a lot of those folks out
> there that have that background.
>

Note that we have always done a combined numpy/scipy ideas page and
submission. For really good students numpy may be the right challenge, but
in general scipy is easier to get started on. So we have difficult project
ideas for both, but easy/intermediate ones will most likely be for scipy.


>
> So to students, I suggest you keep an eye out, and engage a little later
> on in the process.
>
> That being said, if you have a idea for a numpy improvement you'd like to
> work on , by all means propose it and maybe you'll get a mentor or two
> excited.
>
> -CHB
>
>
>
>
>
> On Mon, Feb 8, 2016 at 3:33 PM, SMRUTI RANJAN SAHOO 
> wrote:
>
>> sir actually i am interested  very much . so can you help me about this
>> or suggest some , so that i  can contribute .
>>
>
Hi Smruti, I suggest you look at the numpy or scipy issues on Github, and
start with one labeled "easy-fix".


>
>> Thanks  & Regards,
>> Smruti Ranjan Sahoo
>>
>> On Tue, Feb 9, 2016 at 1:58 AM, Chris Barker 
>> wrote:
>>
>>>
>>> I think the real challenge is having folks with the time to really put
>>> into mentoring, but if folks want to do it -- numpy could really benefit.
>>>
>>> Maybe as a python.org sub-project?
>>>
>>
Under the PSF umbrella has always worked very well, both in terms of
communication quality and of getting the amount of slots we wanted, so yes.


>
>>> https://wiki.python.org/moin/SummerOfCode/2016
>>>
>>> Deadlines are approaching -- so I thought I'd ping the list and see if
>>> folks are interested.
>>> ANyone interested in Google Summer of Code this year?
>>>
>>
Yes, last year we had quite a productive GSoC, so I had planned to organize
it along the same lines again (with an updated ideas page of course).

Are you maybe interested in co-organizing or mentoring Chris? Updating the
ideas page, proposal reviewing and interviewing students via video calls
can be time-consuming, and mentoring definitely is, so the more the merrier.

Cheers,
Ralf


Re: [Numpy-discussion] GSoC?

2016-02-10 Thread Ralf Gommers
On Wed, Feb 10, 2016 at 11:48 PM, Chris Barker 
wrote:

> Thanks Ralf,
>
> Note that we have always done a combined numpy/scipy ideas page and
>> submission. For really good students numpy may be the right challenge, but
>> in general scipy is easier to get started on.
>>
>
> yup -- good idea. Is there a page ready to go, or do we need to get one
> up? (I don't even know where to put it...)
>

This is last year's page:
https://github.com/scipy/scipy/wiki/GSoC-2015-project-ideas

Some ideas have been worked on, others are still relevant. Let's copy this
page to -2016- and start editing it and adding new ideas. I'll start right
now actually.


>
>
>> Under the PSF umbrella has always worked very well, both in terms of
>> communication quality and of getting the amount of slots we wanted, so yes.
>>
>
> hmm, looking here:
>
> https://wiki.python.org/moin/SummerOfCode/2016#Sub-orgs
>
> it seems it's time to get started. and I _think_ our ideas page can go on
> that Wiki.
>
>
>> Are you maybe interested in co-organizing or mentoring Chris? Updating
>> the ideas page, proposal reviewing and interviewing students via video
>> calls can be time-consuming, and mentoring definitely is, so the more the
>> merrier.
>>
>
> I would love to help -- though I don't think I can commit to being a
> full-on mentor.
>
> If we get a couple people to agree to mentor,
>

That's always the tricky part. We normally let people indicate whether
they're interested in mentoring for specific project ideas on the ideas
page.


> then we can get ourselves setup up with the PSF.
> 
>

That's the easiest part, takes one email and one wiki page edit:)

Ralf


Re: [Numpy-discussion] GSoC?

2016-02-10 Thread Ralf Gommers
On Wed, Feb 10, 2016 at 11:55 PM, Ralf Gommers <ralf.gomm...@gmail.com>
wrote:

>
>
> On Wed, Feb 10, 2016 at 11:48 PM, Chris Barker <chris.bar...@noaa.gov>
> wrote:
>
>> Thanks Ralf,
>>
>> Note that we have always done a combined numpy/scipy ideas page and
>>> submission. For really good students numpy may be the right challenge, but
>>> in general scipy is easier to get started on.
>>>
>>
>> yup -- good idea. Is there a page ready to go, or do we need to get one
>> up? (I don't even know where to put it...)
>>
>
> This is last year's page:
> https://github.com/scipy/scipy/wiki/GSoC-2015-project-ideas
>
> Some ideas have been worked on, others are still relevant. Let's copy this
> page to -2016- and start editing it and adding new ideas. I'll start right
> now actually.
>

OK first version:
https://github.com/scipy/scipy/wiki/GSoC-2016-project-ideas
I kept some of the ideas from last year, but removed all potential mentors
as the same people may not be available this year - please re-add
yourselves where needed.

And to everyone who has a good idea, and preferably is willing to mentor
for that idea: please add it to that page.

Ralf



>
>
>>
>>
>>> Under the PSF umbrella has always worked very well, both in terms of
>>> communication quality and of getting the amount of slots we wanted, so yes.
>>>
>>
>> hmm, looking here:
>>
>> https://wiki.python.org/moin/SummerOfCode/2016#Sub-orgs
>>
>> it seems it's time to get started. and I _think_ our ideas page can go on
>> that Wiki.
>>
>>
>>> Are you maybe interested in co-organizing or mentoring Chris? Updating
>>> the ideas page, proposal reviewing and interviewing students via video
>>> calls can be time-consuming, and mentoring definitely is, so the more the
>>> merrier.
>>>
>>
>> I would love to help -- though I don't think I can commit to being a
>> full-on mentor.
>>
>> If we get a couple people to agree to mentor,
>>
>
> That's always the tricky part. We normally let people indicate whether
> they're interested in mentoring for specific project ideas on the ideas
> page.
>
>
>> then we can get ourselves setup up with the PSF.
>> <https://mail.scipy.org/mailman/listinfo/numpy-discussion>
>>
>
> That's the easiest part, takes one email and one wiki page edit:)
>
> Ralf
>
>
>


Re: [Numpy-discussion] Numpy 1.11.0b2 released

2016-02-01 Thread Ralf Gommers
On Sun, Jan 31, 2016 at 11:57 AM, Julian Taylor <
jtaylor.deb...@googlemail.com> wrote:

> On 01/30/2016 06:27 PM, Ralf Gommers wrote:
> >
> >
> > On Fri, Jan 29, 2016 at 11:39 PM, Nathaniel Smith <n...@pobox.com
> > <mailto:n...@pobox.com>> wrote:
> >
> > It occurs to me that the best solution might be to put together a
> > .travis.yml for the release branches that does: "for pkg in
> > IMPORTANT_PACKAGES: pip install $pkg; python -c 'import pkg;
> > pkg.test()'"
> > This might not be viable right now, but will be made more viable if
> > pypi starts allowing official Linux wheels, which looks likely to
> > happen before 1.12... (see PEP 513)
> >
> > On Jan 29, 2016 9:46 AM, "Andreas Mueller" <t3k...@gmail.com
> > <mailto:t3k...@gmail.com>> wrote:
> > >
> > > Is this the point when scikit-learn should build against it?
> >
> > Yes please!
> >
> > > Or do we wait for an RC?
> >
> > This is still all in flux, but I think we might actually want a rule
> > that says it can't become an RC until after we've tested
> > scikit-learn (and a list of similarly prominent packages). On the
> > theory that RC means "we think this is actually good enough to
> > release" :-). OTOH I'm not sure the alpha/beta/RC distinction is
> > very helpful; maybe they should all just be betas.
> >
> > > Also, we need a scipy build against it. Who does that?
> >
> > Like Julian says, it shouldn't be necessary. In fact using old
> > builds of scipy and scikit-learn is even better than rebuilding
> > them, because it tests numpy's ABI compatibility -- if you find you
> > *have* to rebuild something then we *definitely* want to know that.
> >
> > > Our continuous integration doesn't usually build scipy or numpy,
> so it will be a bit tricky to add to our config.
> > > Would you run our master tests? [did we ever finish this
> discussion?]
> >
> > We didn't, and probably should... :-)
> >
> > Why would that be necessary if scikit-learn simply tests pre-releases of
> > numpy as you suggested earlier in the thread (with --pre)?
> >
> > There's also https://github.com/MacPython/scipy-stack-osx-testing by the
> > way, which could have scikit-learn and scikit-image added to it.
> >
> > That's two options that are imho both better than adding more workload
> > for the numpy release manager. Also from a principled point of view,
> > packages should test with new versions of their dependencies, not the
> > other way around.
>
>
> It would be nice but its not realistic, I doubt most upstreams that are
> not themselves major downstreams are even subscribed to this list.
>

I'm pretty sure that some core devs from all major scipy stack packages are
subscribed to this list.

Testing or delegating testing of least our major downstreams should be
> the job of the release manager.
>

If we make it (almost) fully automated, like in
https://github.com/MacPython/scipy-stack-osx-testing, then I agree that
adding this to the numpy release checklist would make sense.

But it should really only be a tiny amount of work - we're short on
developer power, and many things that are cross-project like build & test
infrastructure (numpy.distutils, needed pip/packaging fixes,
numpy.testing), scipy.org (the "stack" website), numpydoc, etc. are mostly
maintained by the numpy/scipy devs. I'm very reluctant to say yes to
putting even more work on top of that.

So: it would really help if someone could pick up the automation part of
this and improve the stack testing, so the numpy release manager doesn't
have to do this.

Ralf


Re: [Numpy-discussion] Numpy 1.11.0b2 released

2016-01-30 Thread Ralf Gommers
On Fri, Jan 29, 2016 at 11:39 PM, Nathaniel Smith  wrote:

> It occurs to me that the best solution might be to put together a
> .travis.yml for the release branches that does: "for pkg in
> IMPORTANT_PACKAGES: pip install $pkg; python -c 'import pkg; pkg.test()'"
> This might not be viable right now, but will be made more viable if pypi
> starts allowing official Linux wheels, which looks likely to happen before
> 1.12... (see PEP 513)
>
> On Jan 29, 2016 9:46 AM, "Andreas Mueller"  wrote:
> >
> > Is this the point when scikit-learn should build against it?
> Yes please!
>
> > Or do we wait for an RC?
> This is still all in flux, but I think we might actually want a rule that
> says it can't become an RC until after we've tested scikit-learn (and a
> list of similarly prominent packages). On the theory that RC means "we
> think this is actually good enough to release" :-). OTOH I'm not sure the
> alpha/beta/RC distinction is very helpful; maybe they should all just be
> betas.
>
> > Also, we need a scipy build against it. Who does that?
> Like Julian says, it shouldn't be necessary. In fact using old builds of
> scipy and scikit-learn is even better than rebuilding them, because it
> tests numpy's ABI compatibility -- if you find you *have* to rebuild
> something then we *definitely* want to know that.
>
> > Our continuous integration doesn't usually build scipy or numpy, so it
> will be a bit tricky to add to our config.
> > Would you run our master tests? [did we ever finish this discussion?]
> We didn't, and probably should... :-)
>
> Why would that be necessary if scikit-learn simply tests pre-releases of
numpy as you suggested earlier in the thread (with --pre)?

There's also https://github.com/MacPython/scipy-stack-osx-testing by the
way, which could have scikit-learn and scikit-image added to it.

That's two options that are imho both better than adding more workload for
the numpy release manager. Also from a principled point of view, packages
should test with new versions of their dependencies, not the other way
around.
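Spelled out as a CI-style script, Nathaniel's one-liner would look roughly like this (package list and test invocations are illustrative; not every package exposes a pkg.test() entry point):

```shell
# Install the numpy pre-release, then the latest released downstream
# packages against it, and run their test suites (sketch)
pip install --pre numpy
for pkg in scipy pandas scikit-learn; do
    pip install "$pkg"
    python -c "import $pkg; $pkg.test()"
done
```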

Ralf


Re: [Numpy-discussion] Numpy 1.11.0b2 released

2016-01-28 Thread Ralf Gommers
On Fri, Jan 29, 2016 at 4:21 AM, Nathaniel Smith <n...@pobox.com> wrote:

> On Jan 28, 2016 3:25 PM, "Ralf Gommers" <ralf.gomm...@gmail.com> wrote:
> >
> >
> >
> > On Thu, Jan 28, 2016 at 11:57 PM, Nathaniel Smith <n...@pobox.com> wrote:
> >>
> >> On Thu, Jan 28, 2016 at 2:23 PM, Ralf Gommers <ralf.gomm...@gmail.com>
> wrote:
> >> >
> >> >
> >> > On Thu, Jan 28, 2016 at 11:03 PM, Charles R Harris
> >> > <charlesr.har...@gmail.com> wrote:
> >> >>
> >> >>
> >> >>
> >> >> On Thu, Jan 28, 2016 at 2:36 PM, Nathaniel Smith <n...@pobox.com>
> wrote:
> >> >>>
> >> >>> Maybe we should upload to pypi? This allows us to upload binaries
> for osx
> >> >>> at least, and in general will make the beta available to anyone who
> does
> >> >>> 'pip install --pre numpy'. (But not regular 'pip install numpy',
> because pip
> >> >>> is clever enough to recognize that this is a prerelease and should
> not be
> >> >>> used by default.)
> >> >>>
> >> >>> (For bonus points, start a campaign to convince everyone to add
> --pre to
> >> >>> their ci setups, so that merely uploading a prerelease will ensure
> that it
> >> >>> starts getting tested automatically.)
> >> >>>
> >> >>> On Jan 28, 2016 12:51 PM, "Charles R Harris" <
> charlesr.har...@gmail.com>
> >> >>> wrote:
> >> >>>>
> >> >>>> Hi All,
> >> >>>>
> >> >>>> I hope I am pleased to announce the Numpy 1.11.0b2 release. The
> first
> >> >>>> beta was a damp squib due to missing files in the released source
> files,
> >> >>>> this release fixes that. The new source filese may be downloaded
> from
> >> >>>> sourceforge, no binaries will be released until the mingw tool
> chain
> >> >>>> problems are sorted.
> >> >>>>
> >> >>>> Please test and report any problem.
> >> >>
> >> >>
> >> >> So what happens if I use twine to upload a beta? Mind, I'd give it a
> try
> >> >> if pypi weren't an irreversible machine of doom.
> >> >
> >> >
> >> > One of the things that will probably happen but needs to be avoided
> is that
> >> > 1.11b2 becomes the visible release at
> https://pypi.python.org/pypi/numpy. By
> >> > default I think the status of all releases but the last uploaded one
> (or
> >> > highest version number?) is set to hidden.
> >>
> >> Huh, I had the impression that if it was ambiguous whether the "latest
> >> version" was a pre-release or not, then pypi would list all of them on
> >> that page -- at least I know I've seen projects where going to the
> >> main pypi URL gives a list of several versions like that. Or maybe the
> >> next-to-latest one gets hidden by default and you're supposed to go
> >> back and "un-hide" the last release manually.
> >>
> >> Could try uploading to
> >>
> >>   https://testpypi.python.org/pypi
> >>
> >> and see what happens...
> >
> >
> > That's worth a try, would be good to know what the behavior is.
> >
> >>
> >>
> >> > Other ways that users can get a pre-release by accident are:
> >> > - they have pip <1.4 (released in July 2013)
> >>
> >> It looks like ~a year ago this was ~20% of users --
> >> https://caremad.io/2015/04/a-year-of-pypi-downloads/
> >> I wouldn't be surprised if it dropped quite a bit since then, but if
> >> this is something that will affect our decision then we can ping
> >> @dstufft to ask for updated numbers.
> >
> >
> > Hmm, that's more than I expected. Even if it dropped by a factor of 10
> over the last year, that would still be a lot of failed installs for the
> current beta1. It looks to me like this is a bad trade-off. It would be
> much better to encourage people to test against numpy master instead of a
> pre-release (and we were trying to do that anyway). So the benefit is then
> fairly limited, mostly typing the longer line including wheels.scipy.org
> when someone wants to test a pre-release.
>
> After the disastrous lack of testing for the 1.10 prereleases, it
> might almost be a g

Re: [Numpy-discussion] Numpy 1.11.0b2 released

2016-01-28 Thread Ralf Gommers
On Thu, Jan 28, 2016 at 11:57 PM, Nathaniel Smith <n...@pobox.com> wrote:

> On Thu, Jan 28, 2016 at 2:23 PM, Ralf Gommers <ralf.gomm...@gmail.com>
> wrote:
> >
> >
> > On Thu, Jan 28, 2016 at 11:03 PM, Charles R Harris
> > <charlesr.har...@gmail.com> wrote:
> >>
> >>
> >>
> >> On Thu, Jan 28, 2016 at 2:36 PM, Nathaniel Smith <n...@pobox.com> wrote:
> >>>
> >>> Maybe we should upload to pypi? This allows us to upload binaries for
> osx
> >>> at least, and in general will make the beta available to anyone who
> does
> >>> 'pip install --pre numpy'. (But not regular 'pip install numpy',
> because pip
> >>> is clever enough to recognize that this is a prerelease and should not
> be
> >>> used by default.)
> >>>
> >>> (For bonus points, start a campaign to convince everyone to add --pre
> to
> >>> their ci setups, so that merely uploading a prerelease will ensure
> that it
> >>> starts getting tested automatically.)
> >>>
> >>> On Jan 28, 2016 12:51 PM, "Charles R Harris" <
> charlesr.har...@gmail.com>
> >>> wrote:
> >>>>
> >>>> Hi All,
> >>>>
> >>>> I hope I am pleased to announce the Numpy 1.11.0b2 release. The first
> >>>> beta was a damp squib due to missing files in the released source
> files,
> >>>> this release fixes that. The new source files may be downloaded from
> >>>> sourceforge, no binaries will be released until the mingw tool chain
> >>>> problems are sorted.
> >>>>
> >>>> Please test and report any problem.
> >>
> >>
> >> So what happens if I use twine to upload a beta? Mind, I'd give it a try
> >> if pypi weren't an irreversible machine of doom.
> >
> >
> > One of the things that will probably happen but needs to be avoided is
> that
> > 1.11b2 becomes the visible release at https://pypi.python.org/pypi/numpy.
> By
> > default I think the status of all releases but the last uploaded one (or
> > highest version number?) is set to hidden.
>
> Huh, I had the impression that if it was ambiguous whether the "latest
> version" was a pre-release or not, then pypi would list all of them on
> that page -- at least I know I've seen projects where going to the
> main pypi URL gives a list of several versions like that. Or maybe the
> next-to-latest one gets hidden by default and you're supposed to go
> back and "un-hide" the last release manually.
>
> Could try uploading to
>
>   https://testpypi.python.org/pypi
>
> and see what happens...
>

That's worth a try, would be good to know what the behavior is.


>
> > Other ways that users can get a pre-release by accident are:
> > - they have pip <1.4 (released in July 2013)
>
> It looks like ~a year ago this was ~20% of users --
> https://caremad.io/2015/04/a-year-of-pypi-downloads/
> I wouldn't be surprised if it dropped quite a bit since then, but if
> this is something that will affect our decision then we can ping
> @dstufft to ask for updated numbers.
>

Hmm, that's more than I expected. Even if it dropped by a factor of 10 over
the last year, that would still be a lot of failed installs for the current
beta1. It looks to me like this is a bad trade-off. It would be much better
to encourage people to test against numpy master instead of a pre-release
(and we were trying to do that anyway). So the benefit is then fairly
limited, mostly typing the longer line including wheels.scipy.org when
someone wants to test a pre-release.
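As an aside, the rule that pip >= 1.4 applies here comes from PEP 440: a version carrying an a/b/rc (or .dev) segment counts as a pre-release and is skipped unless `--pre` is given. A rough illustration of that rule — not pip's actual implementation, just a sketch:

```python
import re

# Simplified PEP 440 pre-release detection (illustrative only; real pip
# uses a full PEP 440 parser): versions ending in aN/bN/rcN or .devN
# are pre-releases and are skipped by 'pip install numpy' without --pre.
PRE_SEGMENT = re.compile(r"(a|b|rc|\.dev)\d+$")

def is_prerelease(version: str) -> bool:
    return bool(PRE_SEGMENT.search(version))

for v in ("1.10.4", "1.11.0b2", "1.11.0rc1", "1.12.0.dev0"):
    print(v, is_prerelease(v))
# 1.10.4 False / 1.11.0b2 True / 1.11.0rc1 True / 1.12.0.dev0 True
```

So under this rule 1.11.0b2 would only ever be installed deliberately.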

Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Numpy 1.11.0b2 released

2016-01-28 Thread Ralf Gommers
On Thu, Jan 28, 2016 at 11:03 PM, Charles R Harris <
charlesr.har...@gmail.com> wrote:

>
>
> On Thu, Jan 28, 2016 at 2:36 PM, Nathaniel Smith  wrote:
>
>> Maybe we should upload to pypi? This allows us to upload binaries for osx
>> at least, and in general will make the beta available to anyone who does
>> 'pip install --pre numpy'. (But not regular 'pip install numpy', because
>> pip is clever enough to recognize that this is a prerelease and should not
>> be used by default.)
>>
>> (For bonus points, start a campaign to convince everyone to add --pre to
>> their ci setups, so that merely uploading a prerelease will ensure that it
>> starts getting tested automatically.)
>> On Jan 28, 2016 12:51 PM, "Charles R Harris" 
>> wrote:
>>
>>> Hi All,
>>>
>>> I hope I am pleased to announce the Numpy 1.11.0b2 release. The first
>>> beta was a damp squib due to missing files in the released source files,
>>> this release fixes that. The new source files may be downloaded from
>>> sourceforge, no binaries will be released until the mingw tool chain
>>> problems are sorted.
>>>
>>> Please test and report any problem.
>>>
>>
> So what happens if I use twine to upload a beta? Mind, I'd give it a try
> if pypi weren't an irreversible machine of doom.
>

One of the things that will probably happen but needs to be avoided is that
1.11b2 becomes the visible release at https://pypi.python.org/pypi/numpy.
By default I think the status of all releases but the last uploaded one (or
highest version number?) is set to hidden.

Other ways that users can get a pre-release by accident are:
- they have pip <1.4 (released in July 2013)
- other packages have a requirement on numpy with a prerelease version (see
https://pip.pypa.io/en/stable/reference/pip_install/#pre-release-versions)

Ralf


Re: [Numpy-discussion] Bump warning stacklevel

2016-01-27 Thread Ralf Gommers
On Wed, Jan 27, 2016 at 11:02 PM, sebastian <sebast...@sipsolutions.net>
wrote:

> On 2016-01-27 21:01, Ralf Gommers wrote:
>
>>
>> One issue will be how to keep this consistent. `stacklevel` is used so
>> rarely that new PRs will always omit it for new warnings. Will we just
>> rely on code review, or would a private wrapper around `warn` to use
>> inside numpy plus a test that checks that the wrapper is used
>> everywhere be helpful here?
>>
>>
> Yeah, I mean you could add tests for the individual functions in principle.
> I am not sure if adding an alias helps much, how are we going to test that
> warnings.warn is not being used? Seems like quite a bit of voodoo necessary
> for that.
>

I was thinking something along these lines, but with a regexp checking for
warnings.warn:
https://github.com/scipy/scipy/blob/master/scipy/fftpack/tests/test_import.py

Probably more trouble than it's worth though.
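A minimal sketch of such a check — a regex scan that flags `warnings.warn(...)` calls missing a `stacklevel` argument (illustrative only, and simplistic in that it assumes each call fits on one line):

```python
import re

# Flag warnings.warn(...) calls whose argument list does not mention
# stacklevel anywhere before the closing parenthesis.
BARE_WARN = re.compile(r"warnings\.warn\((?![^)]*stacklevel)")

def find_bare_warns(source: str):
    """Return character offsets of warn() calls missing stacklevel."""
    return [m.start() for m in BARE_WARN.finditer(source)]

ok = "warnings.warn('Mean of empty slice.', RuntimeWarning, stacklevel=2)"
bad = "warnings.warn('Mean of empty slice.', RuntimeWarning)"
print(len(find_bare_warns(ok)), len(find_bare_warns(bad)))  # 0 1
```

Run over every source file, this would turn the code-review convention into a test, at the cost of some regex fragility.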

Ralf


Re: [Numpy-discussion] Bump warning stacklevel

2016-01-27 Thread Ralf Gommers
On Wed, Jan 27, 2016 at 7:26 PM, Sebastian Berg 
wrote:

> Hi all,
>
> in my PR about warnings suppression, I currently also have a commit
> which bumps the warning stacklevel to two (or three), i.e. use:
>
> warnings.warn(..., stacklevel=2)
>
> (almost) everywhere. This means that for example (take only the empty
> warning):
>
> np.mean([])
>
> would not print:
>
> /usr/lib/python2.7/dist-packages/numpy/core/_methods.py:55:
> RuntimeWarning: Mean of empty slice.
>   warnings.warn("Mean of empty slice.", RuntimeWarning)
>
> but instead print the actual `np.mean([])` code line (the repetition of
> the warning command is always a bit funny).
>
> The advantage is nicer printing for the user.
>
> The disadvantage would probably mostly be that existing warning filters
> that use the `module` keyword argument, will fail.
>
> Any objections/thoughts about doing this change to try to better report
> the offending code line?


This has annoyed me for a long time, it's hard now to figure out where
warnings really come from. Especially when running something large like
scipy.test(). So +1.
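The effect Sebastian describes can be sketched with a toy "library" function (hypothetical names, not numpy's actual code):

```python
import warnings

def mean_of_empty():
    # With stacklevel=2 the warning is attributed to the caller's line,
    # not to this line inside the "library" function.
    warnings.warn("Mean of empty slice.", RuntimeWarning, stacklevel=2)
    return float("nan")

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    mean_of_empty()  # the reported filename/lineno now point here

print(caught[0].category.__name__, str(caught[0].message))
```

With the default stacklevel=1, the recorded location would instead be the `warnings.warn` line inside `mean_of_empty`.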


> Frankly, I am not sure whether there might be
> a python standard about this, but I would expect that for a library
> such as numpy, it makes sense to change. But, if downstream uses
> warning filters with modules, we might want to reconsider for example.
>

There probably are usages of `module`, but I'd expect that it's used a lot
less than `category` or `message`. A quick search through the scipy repo
gave me only a single case where `module` was used, and that's in
deprecated weave code, so soon the count will be zero. Also, even for relevant
usage, nothing will break in a bad way - some more noise or a spurious test
failure in numpy-using code isn't the end of the world I'd say.

One issue will be how to keep this consistent. `stacklevel` is used so
rarely that new PRs will always omit it for new warnings. Will we just rely
on code review, or would a private wrapper around `warn` to use inside
numpy plus a test that checks that the wrapper is used everywhere be
helpful here?

Ralf


Re: [Numpy-discussion] [SciPy-Dev] Numpy 1.11.0b1 is out

2016-01-27 Thread Ralf Gommers
On Wed, Jan 27, 2016 at 4:47 AM, Charles R Harris  wrote:

>
>
> On Tue, Jan 26, 2016 at 7:45 PM, Charles R Harris <
> charlesr.har...@gmail.com> wrote:
>
>>
>>
>> On Tue, Jan 26, 2016 at 7:13 PM, Charles R Harris <
>> charlesr.har...@gmail.com> wrote:
>>
>>>
>>>
>>> On Tue, Jan 26, 2016 at 6:58 PM, Charles R Harris <
>>> charlesr.har...@gmail.com> wrote:
>>>


 On Tue, Jan 26, 2016 at 4:48 PM, Derek Homeier <
 de...@astro.physik.uni-goettingen.de> wrote:

> Hi Chuck,
>
> > I'm pleased to announce that Numpy 1.11.0b1 is now available on
> sourceforge. This is a source release as the mingw32 toolchain is broken.
> Please test it out and report any errors that you discover. Hopefully we
> can do better with 1.11.0 than we did with 1.10.0 ;)
>
> the tarball seems to be incomplete, hope that does not bode ill ;-)
>
>   adding
> 'build/src.macosx-10.10-x86_64-2.7/numpy/core/include/numpy/_numpyconfig.h'
> to sources.
> executing numpy/core/code_generators/generate_numpy_api.py
> error: [Errno 2] No such file or directory:
> 'numpy/core/code_generators/../src/multiarray/arraytypes.c.src'
>

 Grr, yes indeed, `paver sdist` doesn't do the right thing, none of the
 `multiarray/*.c.src` files are included, but it works fine in 1.10.x. The
 changes are minimal; the only thing that would seem to matter is the
 removal of setupegg.py. Ralf, any ideas?


> > tar tvf /sw/src/numpy-1.11.0b1.tar.gz |grep arraytypes
> -rw-rw-r-- charris/charris   62563 2016-01-21 20:38
> numpy-1.11.0b1/numpy/core/include/numpy/ndarraytypes.h
> -rw-rw-r-- charris/charris 981 2016-01-21 20:38
> numpy-1.11.0b1/numpy/core/src/multiarray/arraytypes.h
>
> FWIW, the maintenance/1.11.x branch (there is no tag for the beta?)
> builds and passes all tests with Python 2.7.11
> and 3.5.1 on  Mac OS X 10.10.
>
>
 You probably didn't fetch the tags, if they can't be reached from the
 branch head they don't download automatically. Try `git fetch --tags
 upstream`


>>> setupegg.py doesn't seem to matter...
>>>
>>>
>> OK, it is the changes in the root setup.py file, probably the switch to
>> setuptools.
>>
>
> Yep, it's setuptools. If I import sdist from distutils instead, everything
> works fine.
>

Setuptools starts build_src; I don't know why yet, but it doesn't look to me
like it should be doing that. Issue for more detailed discussion:
https://github.com/numpy/numpy/issues/7127

Ralf


Re: [Numpy-discussion] Building Numpy with OpenBLAS

2016-01-27 Thread Ralf Gommers
On Wed, Jan 27, 2016 at 3:19 PM, G Young  wrote:

> NumPy will "build" successfully, but then when I type "import numpy", it
> cannot import the multiarray PYD file.
>
> I am using dependency walker, and that's how I know it's the
> libopenblas.dll file that it's not linking to properly, hence my original
> question.
>

The support for MingwPy in numpy.distutils had to be temporarily reverted
(see https://github.com/numpy/numpy/pull/6536), because the patch caused
other issues. So likely it just won't work out of the box now. If you need
it, maybe you can reapply that reverted patch. But otherwise I'd wait a
little bit; we'll sort out the MingwPy build in the near future.

Ralf



> Greg
>
> On Wed, Jan 27, 2016 at 1:58 PM, Michael Sarahan 
> wrote:
>
>> When you say find/use, can you please clarify whether you have completed
>> the compilation/linking successfully?  I'm not clear on exactly when you're
>> having problems.  What is the error output?
>>
>> One very helpful tool in diagnosing dll problems is dependency walker:
>> http://www.dependencywalker.com/
>>
>> It may be that your openblas has a dependency that it can't load for some
>> reason.  Dependency walker works on .pyd files as well as .dll files.
>>
>> Hth,
>> Michael
>>
>> On Wed, Jan 27, 2016, 07:40 G Young  wrote:
>>
>>> I do have my site.cfg file pointing to my library which contains a .lib
>>> file along with the appropriate include_dirs parameter.  However, NumPy
>>> can't seem to find / use the DLL file no matter where I put it (numpy/core,
>>> same directory as openblas.lib).  By the way, I should mention that I am
>>> using a slightly dated version of OpenBLAS (0.2.9), but that shouldn't have
>>> any effect I would imagine.
>>>
>>> Greg
>>>
>>> On Wed, Jan 27, 2016 at 1:14 PM, Michael Sarahan 
>>> wrote:
>>>
 I'm not sure about the mingw tool chain, but usually on windows at link
 time you need a .lib file, called the import library.  The .dll is used at
 runtime, not at link time.  This is different from *nix, where the .so
 serves both purposes.  The link you posted mentions import files, so I hope
 this is helpful information.

 Best,
 Michael

 On Wed, Jan 27, 2016, 03:39 G Young  wrote:

> Hello all,
>
> I'm trying to update the documentation for building Numpy from source,
> and I've hit a brick wall in trying to build the library using OpenBLAS
> because I can't seem to link the libopenblas.dll file.  I tried following
> the suggestion of placing the DLL in numpy/core as suggested here
>  but
> it still doesn't pick it up.  What am I doing wrong?
>
> Thanks,
>
> Greg
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>

 ___
 NumPy-Discussion mailing list
 NumPy-Discussion@scipy.org
 https://mail.scipy.org/mailman/listinfo/numpy-discussion


>>> ___
>>> NumPy-Discussion mailing list
>>> NumPy-Discussion@scipy.org
>>> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>>>
>>
>> ___
>> NumPy-Discussion mailing list
>> NumPy-Discussion@scipy.org
>> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>>
>>
>
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>
>
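For context, the site.cfg setup Greg refers to typically looks something like the sketch below. The `[openblas]` section and key names follow numpy's site.cfg.example; the paths are hypothetical placeholders for wherever OpenBLAS was installed:

```ini
[openblas]
libraries = openblas
library_dirs = C:\opt\openblas\lib
include_dirs = C:\opt\openblas\include
```

On Windows the `.lib` import library in `library_dirs` is used at link time, while the matching `libopenblas.dll` must additionally be findable at runtime (e.g. on PATH), which is the distinction Michael raises above.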


Re: [Numpy-discussion] Building Numpy with OpenBLAS

2016-01-27 Thread Ralf Gommers
On Wed, Jan 27, 2016 at 3:51 PM, G Young <gfyoun...@gmail.com> wrote:

> I don't need it at this point.  I'm just going through the exercise for
> purposes of updating building from source on Windows.  But that's good to
> know though.  Thanks!
>

That effort is much appreciated by the way. Updating the build info on all
platforms on http://scipy.org/scipylib/building/index.html is a significant
amount of work, and it has never been in a state one could call complete.
So more contributions definitely welcome!

Ralf



>
> Greg
>
> On Wed, Jan 27, 2016 at 2:48 PM, Ralf Gommers <ralf.gomm...@gmail.com>
> wrote:
>
>>
>>
>> On Wed, Jan 27, 2016 at 3:19 PM, G Young <gfyoun...@gmail.com> wrote:
>>
>>> NumPy will "build" successfully, but then when I type "import numpy", it
>>> cannot import the multiarray PYD file.
>>>
>>> I am using dependency walker, and that's how I know it's the
>>> libopenblas.dll file that it's not linking to properly, hence my original
>>> question.
>>>
>>
>> The support for MingwPy in numpy.distutils had to be temporarily reverted
>> (see https://github.com/numpy/numpy/pull/6536), because the patch caused
>> other issues. So likely it just won't work out of the box now. If you need
>> it, maybe you can reapply that reverted patch. But otherwise I'd wait a
>> little bit; we'll sort out the MingwPy build in the near future.
>>
>> Ralf
>>
>>
>>
>>> Greg
>>>
>>> On Wed, Jan 27, 2016 at 1:58 PM, Michael Sarahan <msara...@gmail.com>
>>> wrote:
>>>
>>>> When you say find/use, can you please clarify whether you have
>>>> completed the compilation/linking successfully?  I'm not clear on exactly
>>>> when you're having problems.  What is the error output?
>>>>
>>>> One very helpful tool in diagnosing dll problems is dependency walker:
>>>> http://www.dependencywalker.com/
>>>>
>>>> It may be that your openblas has a dependency that it can't load for
>>>> some reason.  Dependency walker works on .pyd files as well as .dll files.
>>>>
>>>> Hth,
>>>> Michael
>>>>
>>>> On Wed, Jan 27, 2016, 07:40 G Young <gfyoun...@gmail.com> wrote:
>>>>
>>>>> I do have my site.cfg file pointing to my library which contains a
>>>>> .lib file along with the appropriate include_dirs parameter.  However,
>>>>> NumPy can't seem to find / use the DLL file no matter where I put it
>>>>> (numpy/core, same directory as openblas.lib).  By the way, I should 
>>>>> mention
>>>>> that I am using a slightly dated version of OpenBLAS (0.2.9), but that
>>>>> shouldn't have any effect I would imagine.
>>>>>
>>>>> Greg
>>>>>
>>>>> On Wed, Jan 27, 2016 at 1:14 PM, Michael Sarahan <msara...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> I'm not sure about the mingw tool chain, but usually on windows at
>>>>>> link time you need a .lib file, called the import library.  The .dll is
>>>>>> used at runtime, not at link time.  This is different from *nix, where 
>>>>>> the
>>>>>> .so serves both purposes.  The link you posted mentions import files, so 
>>>>>> I
>>>>>> hope this is helpful information.
>>>>>>
>>>>>> Best,
>>>>>> Michael
>>>>>>
>>>>>> On Wed, Jan 27, 2016, 03:39 G Young <gfyoun...@gmail.com> wrote:
>>>>>>
>>>>>>> Hello all,
>>>>>>>
>>>>>>> I'm trying to update the documentation for building Numpy from
>>>>>>> source, and I've hit a brick wall in trying to build the library using
>>>>>>> OpenBLAS because I can't seem to link the libopenblas.dll file.  I tried
>>>>>>> following the suggestion of placing the DLL in numpy/core as suggested
>>>>>>> here
>>>>>>> <https://github.com/numpy/numpy/wiki/Mingw-static-toolchain#notes> but
>>>>>>> it still doesn't pick it up.  What am I doing wrong?
>>>>>>>
>>>>>>> Thanks,
>>>>>>>
>>>>>>> Greg
>>>>>>> ___
>>>>>>> NumPy-Discussion mailing list
>>>>

Re: [Numpy-discussion] Appveyor Testing Changes

2016-01-26 Thread Ralf Gommers
On Tue, Jan 26, 2016 at 2:13 AM, G Young  wrote:

> Ah, yes, that is true.  That point had completely escaped my mind.  In
> light of this, it seems it's not worthwhile then to completely
> switch over to pip + virtualenv. It might actually be better to rewrite
> the current Appveyor tests to use environments so that the test suite can
> be expanded, though I'm not sure how prudent that is given how slow
> Appveyor tests run.
>

At the moment Appveyor is already a bit of a bottleneck - it regularly
hasn't started yet when TravisCI is already done. This can be solved via a
paid account; we should seriously consider that when we have a bit more
experience with it (Appveyor tests have been running for less than a month
I think). But it does mean we should go for a sparse test matrix, and use a
more complete one (all Python versions for example) on TravisCI. In the
near future we'll have to add MingwPy test runs to Appveyor. Beyond that
I'm not sure what needs to be added?

Ralf



>
> Greg
>
> On Tue, Jan 26, 2016 at 12:13 AM, Bryan Van de Ven 
> wrote:
>
>>
>> > On Jan 25, 2016, at 5:21 PM, G Young  wrote:
>> >
>> > With regards to testing numpy, both Conda and Pip + Virtualenv work
>> quite well.  I have used both to install master and run unit tests, and
>> both pass with flying colors.  This chart here illustrates my point nicely
>> as well.
>> >
>> > However, I can't seem to find / access Conda installations for slightly
>> older versions of Python (e.g. Python 3.4).  Perhaps this is not much of an
>> issue now with the next release (1.12) being written only for Python 2.7
>> and Python 3.4 - 5.  However, if we were to wind the clock slightly back to
>> when we were testing 2.6 - 7, 3.2 - 5, I feel Conda falls short in being
>> able to test on a variety of Python distributions given the nature of Conda
>> releases.  Maybe that situation is no longer the case now, but in the long
>> term, it could easily happen again.
>>
>> Why do you need the installers? The whole point of conda is to be able to
>> create environments with whatever configuration you need. Just pick the
>> newest installer and use "conda create" from there:
>>
>> bryan@0199-bryanv (git:streaming) ~/work/bokeh/bokeh $ conda create -n
>> py26 python=2.6
>> Fetching package metadata: ..
>> Solving package specifications: ..
>> Package plan for installation in environment
>> /Users/bryan/anaconda/envs/py26:
>>
>> The following packages will be downloaded:
>>
>> package|build
>> ---|-
>> setuptools-18.0.1  |   py26_0 343 KB
>> pip-7.1.0  |   py26_0 1.4 MB
>> 
>>Total: 1.7 MB
>>
>> The following NEW packages will be INSTALLED:
>>
>> openssl:1.0.1k-1
>> pip:7.1.0-py26_0
>> python: 2.6.9-1
>> readline:   6.2-2
>> setuptools: 18.0.1-py26_0
>> sqlite: 3.9.2-0
>> tk: 8.5.18-0
>> zlib:   1.2.8-0
>>
>> Proceed ([y]/n)?
>>
>> ___
>> NumPy-Discussion mailing list
>> NumPy-Discussion@scipy.org
>> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>>
>
>
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>
>


Re: [Numpy-discussion] Should I use pip install numpy in linux?

2016-01-19 Thread Ralf Gommers
On Tue, Jan 19, 2016 at 5:57 PM, Chris Barker - NOAA Federal <
chris.bar...@noaa.gov> wrote:

>
> > 2) continue to support those users fairly poorly, and at substantial
> > ongoing cost
>
> I'm curious what the cost is for this poor support -- throw the source
> up on PyPi, and we're done. The cost comes in when trying to build
> binaries...
>

I'm sure Nathaniel means the cost to users of failed installs and of numpy
losing users because of that, not the cost of building binaries.

> Option 1 would require overwhelming consensus of the community, which
> > for better or worse is presumably not going to happen while
> > substantial portions of that community are still using pip/PyPI.
>
> Are they? Which community are we talking about? The community I'd like
> to target are web developers that aren't doing what they think of as
> "scientific" applications, but could use a little of the SciPy stack.
> These folks are committed to pip, and are very reluctant to introduce
> a difficult dependency.  Binary wheels would help these folks, but
> that is not a community that exists yet (or it's small, anyway)
>
> All that being said, I'd be happy to see binary wheels for the core
> SciPy stack on PyPi. It would be nice for people to be able to do a
> bit with Numpy or pandas, it MPL, without having to jump ship to a
> whole new way of doing things.
>

This is indeed exactly why we need binary wheels. Efforts to provide those
will not change our strong recommendation to our users that they're better
off using a scientific Python distribution.

Ralf


[Numpy-discussion] building with setuptools

2016-01-18 Thread Ralf Gommers
Hi all,

We've been able to hold out for a long time, but this is to let you all
know that we have now had to switch the numpy build to rely on setuptools:
https://github.com/numpy/numpy/pull/6895.

Main reason: plain distutils doesn't support PEP 440 naming. See
https://github.com/numpy/numpy/issues/6257 and
https://github.com/numpy/numpy/issues/6551 for details.

So if you see different build weirdness on master, it's likely due to this
change.

Cheers,
Ralf


Re: [Numpy-discussion] Windows build/distribute plan & MingwPy funding

2016-01-09 Thread Ralf Gommers
On Mon, Jan 4, 2016 at 7:38 PM, Matthew Brett <matthew.br...@gmail.com>
wrote:

> Hi,
>
> On Mon, Jan 4, 2016 at 5:35 PM, Erik Bray <erik.m.bray+nu...@gmail.com>
> wrote:
> > On Sat, Jan 2, 2016 at 3:20 AM, Ralf Gommers <ralf.gomm...@gmail.com>
> wrote:
> >> Hi all,
> >>
> >> You probably know that building Numpy, Scipy and the rest of the Scipy
> Stack
> >> on Windows is problematic. And that there are plans to adopt the static
> >> MinGW-w64 based toolchain that Carl Kleffner has done a lot of work on
> for
> >> the last two years to fix that situation.
> >>
> >> The good news is: this has become a lot more concrete just now, with
> this
> >> proposal for funding:
> http://mingwpy.github.io/proposal_december2015.html
> >>
> >> Funding for phases 1 and 2 is already confirmed; the phase 3 part has
> been
> >> submitted to the PSF. Phase 1 (of $1000) is funded by donations made to
> >> Numpy and Scipy (through NumFOCUS), and phase 2 (of $4000) by NumFOCUS
> >> directly. So a big thank you to everyone who made a donation to Numpy,
> Scipy
> >> and NumFOCUS!
>

More good news: the PSF has approved phase 3!


>
> >> I hope that that proposal gives a clear idea of the work that's going
> to be
> >> done over the next months. Note that the http://mingwpy.github.io
> contains a
> >> lot more background info, description of technical issues, etc.
> >>
> >> Feedback & ideas very welcome of course!
> >>
> >> Cheers,
> >> Ralf
> >
> > Hi Ralf,
> >
> > I've seen you drop hints about this recently, and am interested to
> > follow this work.  I've been hired as part of the OpenDreamKit project
> > to work, in large part, on developing a sensible toolchain for
> > building and distributing Sage on Windows.  I know you've been
> > following the thread on that too.  Although the primary goal there is
> > "whatever works", I'm personally inclined to focus on the mingwpy /
> > mingw-w64 approach, due in large part with my past success with the
> > MinGW 32-bit toolchain.
>

That's good to hear Erik.


> (I personally have a desire to improve
> > support for building with MSVC as well, but that's a less important
> > goal as far as the funding is concerned.)
>
>
> > So anyways, please keep me in the loop about this, as I will also be
> > putting effort into this over the next year as well.  Has there been
> > any discussion about setting up a mailing list specifically for this
> > project?
>

We'll definitely keep you in the loop. We try to do as much of the
discussion as possible on the mingwpy mailing list and on
https://github.com/mingwpy. If there's significant progress then I guess
that'll be announced on this list as well.

Cheers,
Ralf


> Yes, it exists already, but not well advertised :
> https://groups.google.com/forum/#!forum/mingwpy
>
> It would be great to share work.
>
> Cheers,
>
> Matthew
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>


Re: [Numpy-discussion] Numpy 1.10.3 release.

2016-01-06 Thread Ralf Gommers
On Wed, Jan 6, 2016 at 8:30 AM, Ralf Gommers <ralf.gomm...@gmail.com> wrote:

>
>
>
> On Mon, Jan 4, 2016 at 9:38 PM, Charles R Harris <
> charlesr.har...@gmail.com> wrote:
>>
>>
>> The failed tests require pyrex, fortran, and swig. The refcount error
>> comes and goes, probably the test isn't very good.  Ralf, is there any
>> reason to keep the various extension building tests? They are very old.
>>
>
> There are f2py tests that are useful - if they fail then building anything
> else with f2py fails as well. However, those are all in f2py/tests/ IIRC,
> not in distutils/tests.
>
>>
>> I don't know what the recommendation on using nosetests is,
>>
>
> Don't use it (it definitely gives test errors due to not having the
> knownfailure plugin), use either runtests.py or numpy.test().
>
>
>> but probably it is finding too many tests.
>>
>
> It's doing its job here in finding these though. It looks like these tests
> need either fixing (the extensions aren't built) or removing. I'll have a
> look tonight.
>

Should be fixed by https://github.com/numpy/numpy/pull/6955

`nosetests numpy` will still give test errors (6 right now), but only due
to not understanding KnownFailureException.

Ralf


Re: [Numpy-discussion] Numpy 1.10.3 release.

2016-01-05 Thread Ralf Gommers
On Mon, Jan 4, 2016 at 9:38 PM, Charles R Harris 
wrote:
>
>
> The failed tests require pyrex, fortran, and swig. The refcount error
> comes and goes, probably the test isn't very good.  Ralf, is there any
> reason to keep the various extension building tests? They are very old.
>

There are f2py tests that are useful - if they fail then building anything
else with f2py fails as well. However, those are all in f2py/tests/ IIRC,
not in distutils/tests.

>
> I don't know what the recommendation on using nosetests is,
>

Don't use it (it definitely gives test errors due to not having the
knownfailure plugin), use either runtests.py or numpy.test().


> but probably it is finding too many tests.
>

It's doing its job here in finding these though. It looks like these tests
need either fixing (the extensions aren't built) or removing. I'll have a
look tonight.

Ralf


[Numpy-discussion] Windows build/distribute plan & MingwPy funding

2016-01-02 Thread Ralf Gommers
Hi all,

You probably know that building Numpy, Scipy and the rest of the Scipy
Stack on Windows is problematic. And that there are plans to adopt the
static MinGW-w64 based toolchain that Carl Kleffner has done a lot of work
on for the last two years to fix that situation.

The good news is: this has become a lot more concrete just now, with this
proposal for funding: http://mingwpy.github.io/proposal_december2015.html

Funding for phases 1 and 2 is already confirmed; the phase 3 part has been
submitted to the PSF. Phase 1 (of $1000) is funded by donations made to
Numpy and Scipy (through NumFOCUS), and phase 2 (of $4000) by NumFOCUS
directly. So a big thank you to everyone who made a donation to Numpy,
Scipy and NumFOCUS!

I hope that that proposal gives a clear idea of the work that's going to be
done over the next months. Note that the http://mingwpy.github.io site contains
a lot more background info, description of technical issues, etc.

Feedback & ideas very welcome of course!

Cheers,
Ralf


Re: [Numpy-discussion] Numpy funding update

2016-01-02 Thread Ralf Gommers
On Fri, Jan 1, 2016 at 12:17 AM, Robert Kern <robert.k...@gmail.com> wrote:

> On Wed, Dec 30, 2015 at 10:54 AM, Ralf Gommers <ralf.gomm...@gmail.com>
> wrote:
> >
> > Hi all,
> >
> > A quick good news message: OSDC has made a $5k contribution to NumFOCUS,
> which is split between support for a women in technology workshop and
> support for Numpy:
> http://www.numfocus.org/blog/osdc-donates-5k-to-support-numpy-women-in-tech
> > This was a very nice surprise to me, and a first sign that the FSA
> (fiscal sponsorship agreement) we recently signed with NumFOCUS is going to
> yield significant benefits for Numpy.
> >
> > NumFOCUS is also doing a special end-of-year fundraiser. Funds donated
> (up to $5k) will be tripled by anonymous sponsors:
> http://www.numfocus.org/blog/numfocus-end-of-year-fundraising-drive-5000-matching-gift-challenge
> > So think of Numpy (or your other favorite NumFOCUS-sponsored project of
> course) if you're considering a holiday season charitable gift!
>
> That sounds great! Do we have any concrete plans for spending that money,
> yet?
>

There are:
  1. developer meetings
  2. MingwPy
  3. TBD - Numpy code/infrastructure work

Some initial plans were discussed and documented during the first Numpy
developer meeting at Scipy'15. See wiki page [1] and meeting minutes [2].

Funding for facilitating more developer meetings (~1 a year) is definitely
on the radar. Plans for funding development of new features or other
code/infrastructure work are still fuzzy (except for MingwPy, see my email
of today on that).

We need to transfer the relevant part of those meeting minutes [2] into the
docs, and expand upon those to get a clear roadmap online that we can build
on. This is work for the next months I think.

Cheers,
Ralf

[1] https://github.com/numpy/numpy/wiki/SciPy-2015-developer-meeting
[2]
https://docs.google.com/document/d/1IJcYdsHtk8MVAM4AZqFDBSf_nVG-mrB4Tv2bh9u1g4Y/edit?usp=sharing


Re: [Numpy-discussion] random integers

2015-12-30 Thread Ralf Gommers
On Thu, Dec 31, 2015 at 6:31 AM, Charles R Harris  wrote:

> Hi All,
>
> I've implemented several new random integer functions in #6910, to wit:
>
>
>- np.random.random_int32
>- np.random.random_int64
>- np.random.random_intp
>
> These are the minimum functions that I think we need for the numpy 1.11.0
> release, most especially the random_intp function for fuzz testing the
> mem_overlap functions. However, there is the question of the best way to
> expose the functions. Currently, they are all separately exposed, but it
> would also be possible to expose them through a new dtype argument to the
> current np.random.random_integers function. Note that all the new
> functions would still be there, but they could be hidden as private
> functions. Also, there is the option of adding a complete set comprising
> booleans, int8, int16, and the unsigned versions. So the two, not mutually
> exclusive, proposed enhancements are
>
>- expose the new functions through a dtype argument to
>random_integers, hide the other functions
>
>
+1 for a single new keyword only and hiding the rest. There are already
random.randint and random.random_integers (the keyword should be added to
both of those); that's already one function too many. Adding even more
functions would be very weird.
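To make that favored option concrete, here is a minimal sketch of a single generator with a dtype keyword (this mirrors the dtype argument that np.random.randint eventually gained in NumPy 1.11; the exact signature is an assumption of the sketch, not something settled in this thread):

```python
import numpy as np

# Sketch: one function with a dtype keyword instead of separate
# random_int32/random_int64/random_intp functions.  The dtype argument
# is the proposed API, assumed here for illustration.
vals = np.random.randint(0, 100, size=5, dtype=np.int32)
assert vals.dtype == np.int32              # values come back in the requested type
assert ((vals >= 0) & (vals < 100)).all()  # and within [low, high)
```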

>
>- expose the new functions through a dtype argument to
>random_integers, not hide the other functions
>- make a complete set of random integer types
>
> There is currently no easy way to specify the complete range, so a
> proposal for that would be to generate random numbers over the full
> possible range of the type if no arguments are specified. That seems like a
> fairly natural extension.
>

I don't understand this point; the low/high keywords already let you specify
the full available range explicitly, don't they?
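For what it's worth, a sketch of requesting the full range of a type through explicit low/high bounds (it uses the dtype keyword under discussion, so this is illustrative rather than the then-current API; note that high is exclusive, hence the + 1):

```python
import numpy as np

# Full range of int32 via explicit low/high; high is exclusive, so
# iinfo.max + 1 makes the type's maximum value reachable.
info = np.iinfo(np.int32)
full = np.random.randint(info.min, info.max + 1, size=10, dtype=np.int32)
assert full.dtype == np.int32
assert info.min <= full.min() and full.max() <= info.max
```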


> Finally, there is also a proposal to allow broadcasting/element wise
> selection of the range. This is the most complicated of the proposed
> enhancements and I am not really in favor, but it would be good to hear
> from others.
>
I don't see much of a use-case. Broadcasting multiple keywords together is
tricky to implement and use. So for the few users that may need this, a
small for loop + array stack should get their job done right?
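As a sketch of that workaround (the bounds below are made-up illustration values):

```python
import numpy as np

# Per-element ranges via a small for loop + array stack, instead of
# broadcasting low/high inside the generator itself.
lows = [0, 10, 100]
highs = [5, 20, 200]
samples = np.stack([np.random.randint(lo, hi, size=4)
                    for lo, hi in zip(lows, highs)])
assert samples.shape == (3, 4)  # one row of draws per (low, high) pair
```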

Ralf


[Numpy-discussion] Numpy funding update

2015-12-30 Thread Ralf Gommers
Hi all,

A quick good news message: OSDC has made a $5k contribution to NumFOCUS,
which is split between support for a women in technology workshop and
support for Numpy:
http://www.numfocus.org/blog/osdc-donates-5k-to-support-numpy-women-in-tech
This was a very nice surprise to me, and a first sign that the FSA (fiscal
sponsorship agreement) we recently signed with NumFOCUS is going to yield
significant benefits for Numpy.

NumFOCUS is also doing a special end-of-year fundraiser. Funds donated (up
to $5k) will be tripled by anonymous sponsors:
http://www.numfocus.org/blog/numfocus-end-of-year-fundraising-drive-5000-matching-gift-challenge
So think of Numpy (or your other favorite NumFOCUS-sponsored project of
course) if you're considering a holiday season charitable gift!

Cheers,
Ralf

