Re: [Numpy-discussion] LA improvements (was: dot function or dot notation, matrices, arrays?)

2009-12-23 Thread David Warde-Farley
On Wed, Dec 23, 2009 at 05:30:16PM -0800, David Goldsmith wrote:
> On Wed, Dec 23, 2009 at 2:26 PM, David Warde-Farley  
> wrote:
> >
> > On 23-Dec-09, at 2:19 PM, David Goldsmith wrote:
> >
> >> Thanks Anne (and Dave): it may seem to you to be "a bit silly to dream
> >> up an API without implementing anything," but I think it's useful to
> >> get these things "on the record" so to speak, and as a person charged
> >> with being especially concerned w/ the doc, it's particularly
> >> important for me to hear when its specific deficiencies are
> >> productivity blockers...
> >
> > In fact, there are gufuncs in the tests that are quite instructive and
> > would form the basis of good documentation, though not enough of them
> > to give a complete picture of what the generalized ufunc architecture
> > can do (I remember looking for an example of a particular supported
> > pattern and coming up short,
> 
> If you came up short, how/why are you certain that the existing arch
> would support it?

The existing documentation made the capabilities of generalized ufuncs
pretty clear; however, not much is demonstrated in terms of the appropriate
C API (or code-generator) constructs.

David
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] LA improvements (was: dot function or dot notation, matrices, arrays?)

2009-12-23 Thread David Goldsmith
On Wed, Dec 23, 2009 at 2:26 PM, David Warde-Farley  wrote:
>
> On 23-Dec-09, at 2:19 PM, David Goldsmith wrote:
>
>> Thanks Anne (and Dave): it may seem to you to be "a bit silly to dream
>> up an API without implementing anything," but I think it's useful to
>> get these things "on the record" so to speak, and as a person charged
>> with being especially concerned w/ the doc, it's particularly
>> important for me to hear when its specific deficiencies are
>> productivity blockers...
>
> In fact, there are gufuncs in the tests that are quite instructive and
> would form the basis of good documentation, though not enough of them
> to give a complete picture of what the generalized ufunc architecture
> can do (I remember looking for an example of a particular supported
> pattern and coming up short,

If you came up short, how/why are you certain that the existing arch
would support it?

> though I can't for the life of me
> remember which).
>
> The existing documentation, plus source code from the umath_tests
> module marked up descriptively (what all the parameters do, especially
> the ones which currently receive magic numbers) would probably be the
> way to go down the road.
>
> David

Perfect, David!  Thanks...

DG
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


[Numpy-discussion] Ironclad v2.6.0rc1 released

2009-12-23 Thread William Reade
Hi all

I'm very happy to announce the latest release (candidate) of Ironclad,
the 120-proof home-brewed CPython compatibility layer, now available
for IronPython 2.6!

No longer need .NET pythonistas toil thanklessly without the benefits
of bz2, csv, numpy and scipy: with a simple 'import ironclad', (most
parts of) the above packages -- and many more -- will transparently
Just Work. For reference: over 1500 tests pass in numpy 1.3.0; over
1900 in 1.4.0RC1; and over 2300 in scipy 0.7.1.
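
In practice the whole dance looks like this (a minimal sketch, assuming
Ironclad and numpy are installed for your IronPython 2.6):

import ironclad   # load the CPython compatibility layer first
import numpy      # from here on, ordinary numpy usage

print(numpy.arange(5).sum())   # -> 10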

Get the package from:
   http://code.google.com/p/ironclad/

...and get support from:
   http://groups.google.com/group/c-extensions-for-ironpython

...or just ask me directly.

I'm very keen to hear your experiences, both positive and negative; I
haven't been able to test it on as many machines as I have in the
past, so your feedback is especially important this time round*.

Cheers
William

* I'd be especially grateful if someone with a newish multicore
machine would run the numpy and scipy test scripts (included in the
source distribution) a few times to check for consistent results and
absence of weird crashes; if someone volunteers, I'll help however I
can.
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] LA improvements (was: dot function or dot notation, matrices, arrays?)

2009-12-23 Thread David Warde-Farley

On 23-Dec-09, at 2:19 PM, David Goldsmith wrote:

> Thanks Anne (and Dave): it may seem to you to be "a bit silly to dream
> up an API without implementing anything," but I think it's useful to
> get these things "on the record" so to speak, and as a person charged
> with being especially concerned w/ the doc, it's particularly
> important for me to hear when its specific deficiencies are
> productivity blockers...

In fact, there are gufuncs in the tests that are quite instructive and  
would form the basis of good documentation, though not enough of them  
to give a complete picture of what the generalized ufunc architecture  
can do (I remember looking for an example of a particular supported  
pattern and coming up short, though I can't for the life of me  
remember which).

The existing documentation, plus source code from the umath_tests  
module marked up descriptively (what all the parameters do, especially  
the ones which currently receive magic numbers) would probably be the  
way to go down the road.
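
For anyone who wants to poke at it right away, one of those test gufuncs
can already be exercised from Python (a small sketch; the exact module
location may differ between numpy versions):

import numpy as np
from numpy.core.umath_tests import inner1d   # test gufunc, signature "(i),(i)->()"

a = np.random.randn(4, 7, 3)
b = np.random.randn(4, 7, 3)

# The core dimension i is the last axis; the leading (4, 7) axes are
# looped/broadcast over by the gufunc machinery itself.
c = inner1d(a, b)
print(c.shape)                           # (4, 7)
print(np.allclose(c, (a * b).sum(-1)))   # True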

David
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] LA improvements (was: dot function or dot notation, matrices, arrays?)

2009-12-23 Thread Anne Archibald
2009/12/23 David Warde-Farley :
> On 23-Dec-09, at 10:34 AM, Anne Archibald wrote:
>
>> The key idea would be that the "linear
>> algebra dimensions" would always be the last one(s); this is fairly
>> easy to arrange with rollaxis when it isn't already true, would tend
>> to reduce copying on input to LAPACK, and is what the gufunc API
>> wants.
>
> Would it actually reduce copying if you were using default C-ordered
> arrays? Maybe I'm mistaken but I thought one almost always had to copy
> in order to translate C to Fortran order except for a few functions
> that can take row-ordered stuff.

That's a good point. But even if you need to do a transpose, it'll be
faster to transpose data in a contiguous block than data scattered all
over memory. Maybe more to the point, broadcasting adds axes to the
beginning, so that (say) two-dimensional arrays can act as "matrix
scalars".

Anne
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] LA improvements (was: dot function or dot notation, matrices, arrays?)

2009-12-23 Thread David Goldsmith
On Wed, Dec 23, 2009 at 10:30 AM, David Warde-Farley  
wrote:
> On 23-Dec-09, at 10:34 AM, Anne Archibald wrote:
>
>> It's been a little while since I took a really close look at it, but
>> I'll try to describe the problems I had. Chiefly I had problems with
>> documentation - the only way I could figure out how to build
>> additional gufuncs was monkey-see-monkey-do, just copying an existing
>> one in an existing file and hoping the build system figured it out. It
>> was also not at all clear how to, say, link to LAPACK, let alone
>> decide based on input types which arguments to promote and how to call
>> out to LAPACK.
>
> I tried to create a new generalized ufunc (a logsumexp to go with
> logaddexp, so as to avoid all the needless exp's and log's that would
> be incurred by logaddexp.reduce) and had exactly the same problem. I
> did get it to build but it was misbehaving (returning an array of the
> same size as the input) and I couldn't figure out quite why. I agree
> that the documentation is lacking, but I think it's (rightly) a low
> priority in the midst of the release candidate.

Thanks Anne (and Dave): it may seem to you to be "a bit silly to dream
up an API without implementing anything," but I think it's useful to
get these things "on the record" so to speak, and as a person charged
with being especially concerned w/ the doc, it's particularly
important for me to hear when its specific deficiencies are
productivity blockers...

>> The key idea would be that the "linear
>> algebra dimensions" would always be the last one(s); this is fairly
>> easy to arrange with rollaxis when it isn't already true, would tend
>> to reduce copying on input to LAPACK, and is what the gufunc API
>> wants.
>
> Would it actually reduce copying if you were using default C-ordered
> arrays? Maybe I'm mistaken but I thought one almost always had to copy
> in order to translate C to Fortran order except for a few functions
> that can take row-ordered stuff.
>
> Otherwise, +1 all the way.

...and of course, discussing these things here begins a dialog that
can be the beginning of getting these improvements made - not
necessarily by you... :-)

Thanks again, for humoring me

DG
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] LA improvements (was: dot function or dot notation, matrices, arrays?)

2009-12-23 Thread David Warde-Farley
On 23-Dec-09, at 10:34 AM, Anne Archibald wrote:

> It's been a little while since I took a really close look at it, but
> I'll try to describe the problems I had. Chiefly I had problems with
> documentation - the only way I could figure out how to build
> additional gufuncs was monkey-see-monkey-do, just copying an existing
> one in an existing file and hoping the build system figured it out. It
> was also not at all clear how to, say, link to LAPACK, let alone
> decide based on input types which arguments to promote and how to call
> out to LAPACK.

I tried to create a new generalized ufunc (a logsumexp to go with  
logaddexp, so as to avoid all the needless exp's and log's that would  
be incurred by logaddexp.reduce) and had exactly the same problem. I  
did get it to build but it was misbehaving (returning an array of the  
same size as the input) and I couldn't figure out quite why. I agree  
that the documentation is lacking, but I think it's (rightly) a low  
priority in the midst of the release candidate.
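
For context, here is a plain-numpy sketch of what that gufunc was meant to
compute (just the reference semantics, not the gufunc itself):

import numpy as np

def logsumexp(x, axis=-1):
    # log(sum(exp(x))) along one axis, shifted by the max for stability
    m = np.max(x, axis=axis)
    return m + np.log(np.sum(np.exp(x - np.expand_dims(m, axis)), axis=axis))

x = np.random.randn(4, 1000)
# Same answer as logaddexp.reduce, but logaddexp.reduce pays for an
# exp/log pair at every reduction step -- the overhead a single-pass
# gufunc would avoid.
print(np.allclose(logsumexp(x), np.logaddexp.reduce(x, axis=-1)))   # True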

> The key idea would be that the "linear
> algebra dimensions" would always be the last one(s); this is fairly
> easy to arrange with rollaxis when it isn't already true, would tend
> to reduce copying on input to LAPACK, and is what the gufunc API
> wants.

Would it actually reduce copying if you were using default C-ordered  
arrays? Maybe I'm mistaken but I thought one almost always had to copy  
in order to translate C to Fortran order except for a few functions  
that can take row-ordered stuff.

Otherwise, +1 all the way.
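
For what it's worth, the copying question is easy to poke at with plain
numpy (a quick sketch):

import numpy as np

a = np.arange(12.0).reshape(3, 4)   # default C order
print(a.flags['F_CONTIGUOUS'])      # False: LAPACK would need a copy
f = np.asfortranarray(a)            # that copy, made explicitly
print(np.may_share_memory(a, f))    # False -- the data really was copied
print(a.T.flags['F_CONTIGUOUS'])    # True: the transpose is a free view,
                                    # cheap when the block is contiguous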

David
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Matlab's griddata3 for numpy?

2009-12-23 Thread Nadav Horesh
You probably have to use the generic interpolation functions from the
scipy.interpolate module:
 scipy.interpolate.splprep, scipy.interpolate.splev, etc.

It could be cumbersome but doable.
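
For instance, here is a rough sketch of one possible route using
scipy.interpolate.Rbf (radial basis functions work on scattered points in
any number of dimensions); it is only an illustration, not a drop-in
griddata3 replacement:

import numpy as np
from scipy.interpolate import Rbf

# Scattered sample points in 3-D and the values measured there.
x, y, z = np.random.rand(3, 200)
values = np.sin(x) * np.cos(y) + z

# Build the interpolant: the last argument is the data, the rest are coords.
rbfi = Rbf(x, y, z, values)

# Evaluate at new 3-D locations, griddata3-style.
xi, yi, zi = np.random.rand(3, 10)
print(rbfi(xi, yi, zi).shape)   # (10,)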

   Nadav


-Original Message-
From: numpy-discussion-boun...@scipy.org on behalf of reckoner
Sent: Wed 23-Dec-09 16:12
To: numpy-discussion@scipy.org
Subject: [Numpy-discussion] Matlab's griddata3 for numpy?
 
Hi,

I realize that there is a griddata for numpy via matplotlib, but is 
there a griddata3 (same as griddata, but for higher dimensions)?

Any help appreciated.


___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] LA improvements (was: dot function or dot notation, matrices, arrays?)

2009-12-23 Thread Anne Archibald
2009/12/23 David Goldsmith :
> Starting a new thread for this.
>
> On Tue, Dec 22, 2009 at 7:13 PM, Anne Archibald
>  wrote:
>
>> I think we have one major lacuna: vectorized linear algebra. If I have
>> to solve a whole whack of four-dimensional linear systems, right now I
>> need to either write a python loop and use linear algebra on them one
>> by one, or implement my own linear algebra. It's a frustrating lacuna,
>> because all the machinery is there: generalized ufuncs and LAPACK
>> wrappers. Somebody just needs to glue them together. I've even tried
>> making a start on it, but numpy's ufunc machinery and generic type
>> system is just too much of a pain for me to make any progress as is.
>
> Please be more specific: what (which aspects) have been "too much of a
> pain"?  (I ask out of ignorance, not out of challenging your
> opinion/experience.)

It's been a little while since I took a really close look at it, but
I'll try to describe the problems I had. Chiefly I had problems with
documentation - the only way I could figure out how to build
additional gufuncs was monkey-see-monkey-do, just copying an existing
one in an existing file and hoping the build system figured it out. It
was also not at all clear how to, say, link to LAPACK, let alone
decide based on input types which arguments to promote and how to call
out to LAPACK.

I'm not saying this is impossible, just that I hit enough frustrating
non-progress to defeat my initial "hey, I could do that" impulse.

>> I think if someone wanted to start building a low-level
>
> Again, please be more specific: what do you mean by this?  (I know
> generally what is meant by "low level," but I'd like you to spell out
> a little more fully what you mean by this in this context.)

Sure. Let me first say that all this is kind of beside the point - the
hard part is not designing an API, so it's a bit silly to dream up an
API without implementing anything.

I had pictured two interfaces to the vectorized linear algebra code.
The first would simply provide more-or-less direct access to
vectorized versions of the linear algebra functions we have now, with
no dimension inference. Thus inv, pinv, svd, lu factor, lu solve, et
cetera - but not dot. Dot would have to be split up into
vector-vector, vector-matrix, matrix-vector, and matrix-matrix
products, since one can no longer use the dimensionality of the inputs
to figure out what is wanted. The key idea would be that the "linear
algebra dimensions" would always be the last one(s); this is fairly
easy to arrange with rollaxis when it isn't already true, would tend
to reduce copying on input to LAPACK, and is what the gufunc API
wants.
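
To make the shape convention concrete, a small plain-numpy sketch (nothing
from the proposed API exists yet):

import numpy as np

# A stack of 1000 independent 4x4 systems, linear-algebra axes last.
A = np.random.randn(1000, 4, 4)
b = np.random.randn(1000, 4)

# If the matrix axes arrive first instead, rollaxis moves them to the
# end without copying (it just returns a transposed view).
A_first = np.random.randn(4, 4, 1000)
A_last = np.rollaxis(A_first, 2)   # shape (1000, 4, 4)

# What the proposed vectorized inv/solve/svd would replace: a Python-level
# loop that calls LAPACK once per tiny system.
x = np.empty_like(b)
for i in range(len(A)):
    x[i] = np.linalg.solve(A[i], b[i])
print(x.shape)   # (1000, 4)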

This is mostly what I meant by low-level. (A second generation would
do things like combine many vector-vector products into a single
LAPACK matrix-vector product.)
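
As a sketch of the kind of rewrite that second generation could perform,
when one operand is shared across all the products:

import numpy as np

V = np.random.randn(5000, 3)   # 5000 vectors
w = np.random.randn(3)         # one vector common to every product

# Naive: 5000 separate level-1 dot products, one Python call each.
naive = np.array([np.dot(V[i], w) for i in range(len(V))])

# The combined form: one matrix-vector product does all of them at once.
combined = np.dot(V, w)

print(np.allclose(naive, combined))   # True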

The higher-level API I was imagining - remember, vaporware here - had
a Matrix and a Vector class, each holding an arbitrarily-dimensioned
array of the relevant object. The point of this is to avoid having to
constantly specify whether you want a matrix-vector or matrix-matrix
product; it also tidily avoids the always-two-dimensional nuisance of
the current matrix API.
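
Purely to illustrate (vaporware, as noted -- every name here is
hypothetical), a toy sketch in which the product dispatches on the
operand's type rather than on array dimensionality:

import numpy as np

class Vector(object):
    def __init__(self, data):
        self.data = np.asarray(data)   # shape (..., n)

class Matrix(object):
    def __init__(self, data):
        self.data = np.asarray(data)   # shape (..., m, n)

    def dot(self, other):
        # The operand's type says which product is wanted, so a plain 2-D
        # array can still be a Matrix and any leading axes broadcast.
        if isinstance(other, Vector):
            out = np.sum(self.data * other.data[..., np.newaxis, :], axis=-1)
            return Vector(out)
        out = np.sum(self.data[..., :, :, np.newaxis] *
                     other.data[..., np.newaxis, :, :], axis=-2)
        return Matrix(out)

M = Matrix(np.random.randn(10, 3, 3))   # ten 3x3 matrices
v = Vector(np.random.randn(3))          # a single length-3 vector
print(M.dot(v).data.shape)              # (10, 3)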


Anne
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] [ANN] numpy 1.4.0 rc2

2009-12-23 Thread Bruce Southey
On 12/22/2009 09:53 PM, David Cournapeau wrote:
> On Wed, Dec 23, 2009 at 12:50 AM, Bruce Southey  wrote:
>
>
>> This still crashes Python 2.7 with the test_multiarray.TestIO.test_ascii.
>>  
> Could you file a ticket next time? I could not follow the discussion
> closely over the last week or so, and although I saw the crash, I missed
> that it had already been discussed.
>
> thanks,
>
> David
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@scipy.org
> http://mail.scipy.org/mailman/listinfo/numpy-discussion
>
Sorry,
Ticket 1345
http://projects.scipy.org/numpy/ticket/1345

I added patches for the 1.4 rc2 version and a patch for the SVN version.

I only tested the 1.4 branch on Python 2.7 after you announced it, 
because I normally follow the SVN trunk. It was also somewhat confusing 
because a fix was already in the SVN version, except that it still needed 
to be extended to cover Python 2.7. (This was due to the Python 3 support 
that was added since the 1.4 branch.)

Some of the Python 3.1 features have been backported to Python 2.7, which 
will help with some of the porting to Python 3. For that reason, I would 
suggest that the release notes indicate that Python 2.7 support is 
experimental - especially as Python 2.7 has only had one alpha release so 
far, and the final release is expected on 2010-06-26.


Bruce
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


[Numpy-discussion] Matlab's griddata3 for numpy?

2009-12-23 Thread reckoner

Hi,

I realize that there is a griddata for numpy via matplotlib, but is 
there a griddata3 (same as griddata, but for higher dimensions)?


Any help appreciated.

___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion