> On 17 Mar 2015, at 09:11, Dieter Van Eessen <dieter.van.ees...@gmail.com> 
> wrote:
> 
> Hello,
> 
> Sorry to disturb again, but the topic still bugs me somehow...
> I'll try to rephrase the question:
> 
> - What's the influence of the type of N-dimensional array representation with 
> respect to TENSOR calculus?
> - Are multiple representations possible?
> - I assume that the order of the dimensions plays a major role in, for example, 
> the TENSOR product. Is this assumption correct?
> 
> As I said before, my math skills are lacking in this area... 
> I hope you consider this a valid question.
> 
> kind regards,
> Dieter
> 
> 
> On Fri, Jan 30, 2015 at 2:32 AM, Alexander Belopolsky <ndar...@mac.com> wrote:
> 
> On Mon, Jan 26, 2015 at 6:06 AM, Dieter Van Eessen 
> <dieter.van.ees...@gmail.com> wrote:
> I've read that numpy.array isn't arranged according to the 'right-hand-rule' 
> (right-hand-rule => thumb = +x; index finger = +y; bent middle finger = +z). 
> This is also confirmed by an old message I dug up from the mailing list 
> archives. (see message below)
> 
> Dieter,
> 
> It looks like you are confusing dimensionality of the array with the 
> dimensionality of a vector that it might store.  If you are interested in 
> using numpy for 3D modeling, you will likely only encounter 1-dimensional 
> arrays (vectors) of size 3 and 2-dimensional arrays (matrices) of size 9 or 
> shape (3, 3).
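> 
> For instance, a minimal sketch (the rotation matrix is just an illustration):
> 
>     import numpy as np
> 
>     v = np.array([1.0, 0.0, 0.0])        # a 3D vector: 1-dimensional array, shape (3,)
>     theta = np.pi / 2
>     # rotation about the z axis: a 2-dimensional array of shape (3, 3)
>     R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
>                   [np.sin(theta),  np.cos(theta), 0.0],
>                   [0.0,            0.0,           1.0]])
>     w = np.dot(R, v)                     # rotated vector, still shape (3,)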
> 
> A 3-dimensional array is a stack of matrices and the 'right-hand-rule' does 
> not really apply.  The notion of C/F-contiguous deals with the order of axes 
> (e.g. width first or depth first) while the right-hand-rule is about the 
> direction of the axes (if you "flip" the middle finger, the right hand becomes 
> the left).  In the case of arrays this would probably correspond to 
> little-endian vs. big-endian: is a[0] stored at a higher or lower address than 
> a[1]?  However, whatever the answer to this question is for a particular 
> system, it is the same for all axes in the array, so the right-hand/left-hand 
> distinction does not apply. 
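> 
> To make the axis-order point concrete, a small sketch:
> 
>     import numpy as np
> 
>     a = np.arange(6, dtype=np.int64).reshape(2, 3)  # C order: last axis varies fastest
>     f = np.asfortranarray(a)                        # F order: first axis varies fastest
>     print(a.flags['C_CONTIGUOUS'], a.strides)       # True (24, 8)
>     print(f.flags['F_CONTIGUOUS'], f.strides)       # True (8, 16)
> 
> Both arrays hold the same values at the same indices; only the order in which 
> the elements sit in memory differs.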
> 

Hi,

let us say you have a rank-k tensor in n dimensions. Then you could represent 
that beast as a numpy array with k axes, each of length n. So, if n is 4 and k 
is 2, you have a rank-2 tensor in 4 dimensions, which would be an array of 
shape (4, 4). If you want higher-order tensors, you have higher-dimensional 
arrays: a tensor of rank 3 would be an array of shape (4, 4, 4) in this 
example. Of course you have to be careful about which axes you do tensor 
operations over.
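
For instance, with n = 4 and k = 3 (a minimal sketch of the above):

    import numpy as np

    n, k = 4, 3
    C = np.zeros((n,) * k)   # rank-3 tensor in 4 dimensions
    print(C.shape)           # (4, 4, 4)
    print(C.ndim)            # 3 -- numpy's ndim counts the tensor rank k, not n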

Something like V_ik = C_ijk v_j would then mean that you want to sum over the 
repeated index j, i.e. along axis 1 of the 3-dimensional array C (of shape 
(4, 4, 4)) and along axis 0 of the 1-dimensional array v (of shape (4,)).

By the way, if you think about doing those things, np.einsum might help. Of 
course, knowing which axis in the numpy array corresponds to which axis (or 
index) in your tensor is quite important. 
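
For the contraction above, a minimal sketch (the random C and v are just 
placeholders for illustration):

    import numpy as np

    n = 4
    C = np.random.rand(n, n, n)        # C_ijk, shape (4, 4, 4)
    v = np.random.rand(n)              # v_j, shape (4,)

    # V_ik = sum_j C_ijk v_j: contract index j (axis 1 of C, axis 0 of v)
    V = np.einsum('ijk,j->ik', C, v)   # shape (4, 4)

    # the same contraction with tensordot
    V2 = np.tensordot(C, v, axes=([1], [0]))
    print(np.allclose(V, V2))          # True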

Hanno

