Re: [Numpy-discussion] doing zillions of 3x3 determinants fast

2008-08-25 Thread Dan Lenski
On Sun, 24 Aug 2008 23:49:45 -0600, Charles R Harris wrote:

 On Sun, Aug 24, 2008 at 9:48 PM, Daniel Lenski [EMAIL PROTECTED]
 wrote:
 
 Hi all,
 I need to take the determinants of a large number of 3x3 matrices, in
 order to determine for each of N points, in which of M tetrahedral
 cells they lie.  I arrange the matrices in an ndarray of shape
 (N,M,5,3,3).


 If you step back a bit and describe the actual problem you have, rather
 than your current solution, it might be that there are algorithms in
 scipy to solve it.


Hi Charles,
I have an irregular/unstructured mesh of tetrahedral cells, and I need to 
interpolate some data over this mesh, at arbitrary points within the 
complete volume.
 
So this is basically 3D interpolation.  I've done some looking into this 
already, and it seems that the only facility for 3D interpolation in 
Scipy is map_coordinates, which only works on a regular grid.

So I have to roll my own interpolation!  I start by trying to figure out 
in which of the tetrahedral cells each interpolation point lies.  The 
standard way to do this is to check, for each face (v1,v2,v3) of a 
tetrahedron, that the point in question lies on the same side of that 
face as the fourth vertex (v4).  This holds when the two determinants 
below have the same sign:

| v1x-x   v1y-y   v1z-z |                  | v1x-v4x   v1y-v4y   v1z-v4z |
| v2x-x   v2y-y   v2z-z |  same sign as    | v2x-v4x   v2y-v4y   v2z-v4z |
| v3x-x   v3y-y   v3z-z |                  | v3x-v4x   v3y-v4y   v3z-v4z |

(Here's a description of a nearly identical algorithm:
http://steve.hollasch.net/cgindex/geometry/ptintet.html)
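[Editorial note: a minimal, unvectorized sketch of the same-side test described above, using one np.linalg.det call per face. The names `same_side` and `in_tetrahedron` are hypothetical, not code from this thread; the `>= 0` comparison is a choice that treats boundary points as inside.]

```python
import numpy as np

def same_side(v1, v2, v3, v4, p):
    """True when p lies on the same side of the plane through (v1,v2,v3)
    as the fourth vertex v4: the two determinants share a sign."""
    d_p  = np.linalg.det(np.array([v1 - p,  v2 - p,  v3 - p]))
    d_v4 = np.linalg.det(np.array([v1 - v4, v2 - v4, v3 - v4]))
    return d_p * d_v4 >= 0  # >= 0 counts points on the face as inside

def in_tetrahedron(verts, p):
    """p is inside iff it passes the same-side test for all four faces."""
    v1, v2, v3, v4 = verts
    return (same_side(v1, v2, v3, v4, p) and
            same_side(v2, v3, v4, v1, p) and
            same_side(v3, v4, v1, v2, p) and
            same_side(v4, v1, v2, v3, p))

verts = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
print(in_tetrahedron(verts, np.array([0.25, 0.25, 0.25])))  # True (centroid)
print(in_tetrahedron(verts, np.array([1.0, 1.0, 1.0])))     # False
```

Vectorizing this over N points and M cells is exactly what leads to the (N,M,5,3,3) array of matrices in the original post.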

Dan

___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://projects.scipy.org/mailman/listinfo/numpy-discussion


[Numpy-discussion] doing zillions of 3x3 determinants fast

2008-08-24 Thread Daniel Lenski
Hi all,
I need to take the determinants of a large number of 3x3 matrices, in 
order to determine for each of N points, in which of M tetrahedral cells 
they lie.  I arrange the matrices in an ndarray of shape (N,M,5,3,3).

As far as I can tell, Numpy doesn't have a function to do determinants 
over a specific set of axes... so I can't say:
  dets = np.linalg.det(a, axis1=-2, axis2=-1)

Using an explicit Python loop is very slow... doing the determinants of 
100,000 random 3x3 matrices in a loop and inserting the results into a 
preallocated results array takes about 5.1 seconds.

So instead I coded up my own routine to do the determinants over the last 
2 axes, based on the naive textbook formula for a 3x3 determinant:
  def det3(ar):
      a = ar[..., 0, 0]; b = ar[..., 0, 1]; c = ar[..., 0, 2]
      d = ar[..., 1, 0]; e = ar[..., 1, 1]; f = ar[..., 1, 2]
      g = ar[..., 2, 0]; h = ar[..., 2, 1]; i = ar[..., 2, 2]
      return (a*e*i + b*f*g + c*d*h) - (g*e*c + h*f*a + i*d*b)

This is *much much* faster... it does the same 100,000 determinants in 
0.07 seconds.  And the results differ from linalg.det's results by less 
than 1 part in 10^15 on average (w/ a std dev of about 1 part in 10^13).

Does anyone know of any well-written routines to do lots and lots of 
determinants of a fixed size?  My routine has a couple problems I can see:
  * it's fast enough for 100,000 determinants, but it bogs down in 
all the temporary arrays when I try to do 1,000,000 determinants 
(a 72 MB array)
  * I've heard the numerical stability of the naive determinant 
algorithms is fairly poor, so I'm reluctant to use this function on 
real data.

Any advice will be appreciated!

Dan



Re: [Numpy-discussion] doing zillions of 3x3 determinants fast

2008-08-24 Thread Daniel Lenski
On Mon, 25 Aug 2008 03:48:54 +, Daniel Lenski wrote:
   * it's fast enough for 100,000 determinants, but it bogs due to
 all the temporary arrays when I try to do 1,000,000 determinants
 (=72 MB array)

I've managed to reduce the memory usage significantly by getting the 
number of temporary arrays down to exactly two:

def det3(ar):
    a = ar[..., 0, 0]; b = ar[..., 0, 1]; c = ar[..., 0, 2]
    d = ar[..., 1, 0]; e = ar[..., 1, 1]; f = ar[..., 1, 2]
    g = ar[..., 2, 0]; h = ar[..., 2, 1]; i = ar[..., 2, 2]

    t = a.copy(); t *= e; t *= i; tot  = t
    t = b.copy(); t *= f; t *= g; tot += t
    t = c.copy(); t *= d; t *= h; tot += t
    t = g.copy(); t *= e; t *= c; tot -= t
    t = h.copy(); t *= f; t *= a; tot -= t
    t = i.copy(); t *= d; t *= b; tot -= t

    return tot

Now it runs very fast with 1,000,000 determinants to do (10X the time 
required to do 100,000) but I'm still worried about the stability with 
real-world data.
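[Editorial note: a quick sanity check along these lines, comparing the vectorized `det3` against a per-matrix `np.linalg.det` loop. The seed and array size are illustrative, not from the thread.]

```python
import numpy as np

def det3(ar):
    # naive cofactor expansion over the last two axes, as in the post
    a = ar[..., 0, 0]; b = ar[..., 0, 1]; c = ar[..., 0, 2]
    d = ar[..., 1, 0]; e = ar[..., 1, 1]; f = ar[..., 1, 2]
    g = ar[..., 2, 0]; h = ar[..., 2, 1]; i = ar[..., 2, 2]
    return (a*e*i + b*f*g + c*d*h) - (g*e*c + h*f*a + i*d*b)

rng = np.random.RandomState(0)
m = rng.rand(1000, 3, 3)

fast = det3(m)                                   # one vectorized pass
slow = np.array([np.linalg.det(x) for x in m])   # explicit Python loop

print(np.abs(fast - slow).max())
```

For well-scaled 3x3 inputs the two agree to near machine precision; the stability worry is mainly about badly scaled or nearly singular cells.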

Dan



Re: [Numpy-discussion] doing zillions of 3x3 determinants fast

2008-08-24 Thread Anne Archibald
2008/8/25 Daniel Lenski [EMAIL PROTECTED]:
 On Mon, 25 Aug 2008 03:48:54 +, Daniel Lenski wrote:
   * it's fast enough for 100,000 determinants, but it bogs due to
 all the temporary arrays when I try to do 1,000,000 determinants
 (=72 MB array)

 I've managed to reduce the memory usage significantly by getting the
 number of temporary arrays down to exactly two:

 def det3(ar):
     a = ar[..., 0, 0]; b = ar[..., 0, 1]; c = ar[..., 0, 2]
     d = ar[..., 1, 0]; e = ar[..., 1, 1]; f = ar[..., 1, 2]
     g = ar[..., 2, 0]; h = ar[..., 2, 1]; i = ar[..., 2, 2]

     t = a.copy(); t *= e; t *= i; tot  = t
     t = b.copy(); t *= f; t *= g; tot += t
     t = c.copy(); t *= d; t *= h; tot += t
     t = g.copy(); t *= e; t *= c; tot -= t
     t = h.copy(); t *= f; t *= a; tot -= t
     t = i.copy(); t *= d; t *= b; tot -= t

     return tot

 Now it runs very fast with 1,000,000 determinants to do (10X the time
 required to do 100,000) but I'm still worried about the stability with
 real-world data.

As far as I know, that's the best you can do (though come to think of
it, a 3D determinant is a cross-product followed by a dot product, so
you might be able to cheat and use some built-in routines). It's a
current limitation of numpy that there is little support for doing
many small linear algebra problems at once. We do have perennial discussions about
improving support for array-of-matrices calculations, and in fact
currently in numpy SVN is code to allow C code doing a 3x3 determinant
to be easily made into a generalized ufunc that does ufunc-like
broadcasting and iteration over all but the last two axes.
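[Editorial note: the cross-then-dot trick can be sketched as a scalar triple product over the last axis. `det3_crossdot` is a hypothetical name, not code from this thread; np.cross broadcasts over the leading axes, so no Python loop is needed.]

```python
import numpy as np

def det3_crossdot(ar):
    # det of each trailing 3x3 block as row0 . (row1 x row2),
    # i.e. the scalar triple product of the rows
    r0 = ar[..., 0, :]
    r1 = ar[..., 1, :]
    r2 = ar[..., 2, :]
    return (r0 * np.cross(r1, r2)).sum(axis=-1)

print(det3_crossdot(np.eye(3)))                       # 1.0
print(det3_crossdot(np.zeros((4, 3, 3))).shape)       # (4,)
```

This still allocates a temporary for the cross product, but it pushes all the arithmetic into compiled code.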

Anne


Re: [Numpy-discussion] doing zillions of 3x3 determinants fast

2008-08-24 Thread Charles R Harris
On Sun, Aug 24, 2008 at 9:48 PM, Daniel Lenski [EMAIL PROTECTED] wrote:

 Hi all,
 I need to take the determinants of a large number of 3x3 matrices, in
 order to determine for each of N points, in which of M tetrahedral cells
 they lie.  I arrange the matrices in an ndarray of shape (N,M,5,3,3).


If you step back a bit and describe the actual problem you have, rather than
your current solution, it might be that there are algorithms in scipy to
solve it.

Chuck