On Tue, Jul 14, 2015 at 10:13 PM Sturla Molden wrote:
> Eric Firing wrote:
>
> > I'm curious: has anyone been looking into what it would take to enable
> > f2py to handle modern Fortran in general? And into prospects for
> > getting such an effort
On Thu, Dec 3, 2015 at 4:07 PM David Verelst wrote:
> Hi,
>
> For the wafo [1] package we are trying to include the extension
> compilation process in setup.py [2] by using setuptools and
> numpy.distutils [3]. Some of the extensions have one Fortran interface
> source
I tried to figure out in-place calculation for the neighbor routine that I
recently submitted to numpy, but got nowhere. See
https://github.com/numpy/numpy/pull/303 for what I came up with. It
currently makes a new array to hold the calculations.
If someone does come up with something, I would
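For context, a generic sketch of what "in-place" means here (this is not the routine from that PR, just an illustration): numpy ufuncs accept an `out=` argument that writes results into an existing buffer instead of allocating a new array.

```python
import numpy as np

# Generic illustration of in-place computation with ufuncs:
# passing out= writes results into an existing array instead of
# allocating a fresh one.
a = np.arange(5.0)
b = np.ones(5)
res = np.add(a, b, out=a)  # res is a itself; no new array is created
assert res is a
```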
Docstrings are not stored in .rst files but in the numpy sources, so
there are some non-trivial technical and workflow details missing here. But
besides that, I think translating everything (even into a single language)
is a massive amount of work, and it's not at all clear if there's enough
Are you thinking only about documentation in .rst files (like the
tutorials), or also the docstrings themselves? The former may be feasible,
the latter I think will be difficult.
Everything. Within the documentation editor the RST docstrings are parsed
from the functions, so instead of only
I have thought for a long time that it would be nice to have numpy/scipy
docs in multiple languages. I didn't have any idea how to do it until I
saw http://sphinx.pocoo.org/intl.html. The gettext builder, which is a
requirement to make this happen, is relatively new to Sphinx.
Outline of above
On Sat, May 19, 2012 at 8:16 PM, Nathaniel Smith n...@pobox.com wrote:
help() just returns the __doc__ attribute, but a large number of numpy's
__doc__ attributes are set up by code at import time, so in principle even
these could be run through gettext pretty easily.
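A minimal sketch of that idea (the helper names here are hypothetical, not numpy API): docstrings assigned at import time could be routed through gettext, which simply returns the original English string when no translation catalog is installed.

```python
import gettext

# NullTranslations stands in for a real compiled .mo catalog; its
# gettext() returns the input string unchanged when no translation exists.
_ = gettext.NullTranslations().gettext

def _set_doc(func, doc):
    # Hypothetical helper: route a docstring through gettext at import time.
    func.__doc__ = _(doc)

def clip(a, lo, hi):
    return max(lo, min(hi, a))

_set_doc(clip, "Limit a value to the interval [lo, hi].")
```

With a real translation catalog installed, `_` would return the localized string instead; `help(clip)` would then show the translated docstring with no change to the function itself.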
I didn't know that. I
On Fri, May 18, 2012 at 5:49 PM, Chao YUE chaoyue...@gmail.com wrote:
Previously I installed numpy 1.5.1, and then I used 'pip install
--upgrade numpy' to install numpy 1.6.1.
Why was the old 1.5.1 installation in /usr/lib/pymodules/python2.7?
I have in the past used 'pip uninstall
I think we should change the roles established for the Numpy/Scipy
documentation editors because they do not work as intended.
For reference they are described here:
http://docs.scipy.org/numpy/Front%20Page/
Basically there aren't that many active people to support being split into
the roles as
I have never found mailing lists good places for discussion and consensus.
I think the format itself does not lend itself to involvement, to carefully
considered positions (or the ability to change them), or to voting, since
all of it can be so easily lost within all of the quoting, the back and forth,
Use 'ma.max' instead of 'np.max'. This might be a bug OR an undocumented
feature. :-)
>>> import numpy as np
>>> import numpy.ma as ma
>>> marr = ma.array(range(10), mask=[0,0,0,0,0,1,1,1,1,1])
>>> np.max(marr)
4  # mask is used
>>> a = []
>>> a.append(marr)
>>> a.append(marr)
>>> np.max(a)
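A sketch of the difference, assuming current numpy.ma behavior: np.max on the masked array itself dispatches to the array's own method and honors the mask, but on a plain list of masked arrays the conversion drops the mask, whereas ma.max rebuilds it from the list elements.

```python
import numpy as np
import numpy.ma as ma

marr = ma.array(range(10), mask=[0, 0, 0, 0, 0, 1, 1, 1, 1, 1])

# np.max on the masked array dispatches to marr.max(), so the mask
# is honored:
print(np.max(marr))          # 4

# On a plain list of masked arrays, conversion to a base ndarray
# drops the mask, so np.max sees the raw underlying data:
print(np.max([marr, marr]))  # 9

# ma.max reconstructs the masks from the list elements:
print(ma.max([marr, marr]))  # 4
```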
It looks like f2py cannot find libaf90math.so, located in
/opt/absoft10.1/shlib. How can I tell f2py where af90math is?
You really have to have this set up in order to run a Fortran executable,
but the only thing that comes to mind is the LD_LIBRARY_PATH environment
variable. LD_LIBRARY_PATH
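For example (the library path below is the one from the original question; adjust it to your Absoft install):

```shell
# Point the dynamic linker at the Absoft runtime before invoking f2py
# (or any program linked against libaf90math.so):
export LD_LIBRARY_PATH=/opt/absoft10.1/shlib:$LD_LIBRARY_PATH
```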
On Mon, Apr 2, 2012 at 12:09 PM, Travis Oliphant tra...@continuum.io wrote:
The idea of using constants instead of strings throughout NumPy is an
interesting one, but should be pushed to another thread and not hold up
this particular PR.
I like the suggestion of Nathaniel. Let's get the PR
I think the suggestion is pad(a, 5, mode='mean'), which would be
consistent with common numpy signatures. The mode keyword should probably
have a default, something commonly used. I'd suggest 'mean', Nathaniel
suggests 'zero', I think either would be fine.
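That is essentially the signature that ended up in released numpy (with 'constant', i.e. zeros by default, rather than 'mean'); a quick sketch of the mode='mean' behavior with today's np.pad:

```python
import numpy as np

a = np.arange(10.0)            # overall mean is 4.5
b = np.pad(a, 2, mode='mean')  # pad 2 values on each side with the mean
print(b)                       # 4.5 at both ends, original data in between
```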
I can't type fast enough. :-) I
I rearranged your questions.
Why is this function allocating new arrays that will just be
copied into the big array and then discarded, instead of filling in
the big array directly? (Again, this is a speed issue.)
My example in the e-mail was incorrect (sorry about that). The way it
actually
My suggestion is:
Step 1: Change the current PR so that it has only one user-exposed
function, something like pad(..., mode=foo), and commit that.
Everyone seems to pretty much like that interface, implementing it
would take 1 hour of work, and then the basic functionality would be
landed
On Wed, Mar 28, 2012 at 6:08 PM, Charles R Harris charlesr.har...@gmail.com wrote:
I think there is also a question of using a prefix pad_xxx for the
function names as opposed to pad.xxx.
If I had it as pad.mean, pad.median, etc., then someone could
from numpy.pad import *
a =
I was hoping pad would get finished some day. Maybe 1.9?
Alright - I do like the idea of passing a function to pad, with a bunch of
pre-made functions in place.
Maybe something like:
a = np.arange(10)
b = pad('mean', a, 2, stat_length=3)
where if the first argument is a string, use
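For comparison, the interface that eventually landed keeps mode as an argument after the array rather than a leading string, and stat_length limits how many edge values feed the statistic; mode can also be a user-supplied function, as suggested above (a sketch against current np.pad):

```python
import numpy as np

a = np.arange(10.0)

# stat_length=3: only the 3 values nearest each edge feed the mean,
# so the left pad is mean([0,1,2]) = 1.0 and the right pad is
# mean([7,8,9]) = 8.0.
b = np.pad(a, 2, mode='mean', stat_length=3)
print(b[0], b[-1])  # 1.0 8.0

# mode can also be a callable that fills the pad region of each
# axis in place:
def fill_edge(vector, pad_width, iaxis, kwargs):
    vector[:pad_width[0]] = -1.0
    vector[-pad_width[1]:] = -1.0

c = np.pad(a, 2, fill_edge)  # -1.0 in the padded positions
```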
I have been developing a set of functions to pad arrays in different
ways. I am really close to having it accepted into numpy, but I want to
revisit an implementation issue that I have become worried about: should
these functions be collected into a 'pad' namespace or put raw into np.lib?
Do I
Hello,
I have a pull request to add a n-dimensional array padding feature at
https://github.com/numpy/numpy/pull/198
This message is only to prod for a final review.
Much thanks to Warren Weckesser and Travis Oliphant for their help in
finding some bugs and working on style and API issues. I