2010/6/13 Alan Bromborsky abro...@verizon.net:
Friedrich Romstedt wrote:
I am writing a symbolic tensor package for general relativity. In making
symbolic tensors concrete
I generate numpy arrays stuffed with sympy functions and symbols.
That sounds interesting.
Now, after I read the
Did you have a look at the tensors in Theano? They seem to merge tensor
algebra, SymPy, NumPy and (optional) GPU computing etc. Even if it
doesn't fill your needs it could perhaps be a better starting point?
http://deeplearning.net/software/theano/library/tensor/basic.html
Dag Sverre
2010/6/13 Pauli Virtanen p...@iki.fi:
def tensor_contraction_single(tensor, dimensions):
    """Perform a single tensor contraction over the dimensions given"""
    # move the contracted dimensions to the end
    swap = [x for x in range(tensor.ndim)
            if x not in dimensions] + list(dimensions)
    x = tensor.transpose(swap)
    # repeatedly take the diagonal of the last two axes, then sum it
    # (the loop body completes the truncated archive snippet and is a
    # plausible reconstruction, not the verbatim original)
    for k in range(len(dimensions) - 1):
        x = x.diagonal(axis1=-2, axis2=-1)
    return x.sum(axis=-1)
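For the T_{iik} case from the original question, the same contraction can also be written directly with np.trace, or as a one-liner with np.einsum (added in NumPy 1.6, i.e. after this thread); a minimal sketch:

```python
import numpy as np

n = 2
T = np.arange(n * n * n, dtype=float).reshape(n, n, n)

# T_k = sum_i T_{iik}: sum the diagonal of the first two axes
Tk = np.trace(T, axis1=0, axis2=1)
print(Tk)  # [6. 8.]

# equivalent spelling with np.einsum (NumPy 1.6+)
assert np.allclose(Tk, np.einsum('iik->k', T))
```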
2010/6/13 Alan Bromborsky abro...@verizon.net:
I am writing a symbolic tensor package for general relativity. In making
symbolic tensors concrete
I generate numpy arrays stuffed with sympy functions and symbols.
That sounds interesting.
The operations are tensor product
Is this not what
core.numeric.tensordot (http://docs.scipy.org/numpy/docs/numpy.core.numeric.tensordot/) does?
DG
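For reference, np.tensordot contracts between two arrays rather than within one, so it covers the tensor-product side of the problem but not the single-array T_{iik} contraction; a small sketch (array names are mine):

```python
import numpy as np

a = np.arange(2.0)          # shape (2,)
b = np.arange(3.0)          # shape (3,)

# tensor (outer) product: axes=0 contracts nothing
P = np.tensordot(a, b, axes=0)
print(P.shape)              # (2, 3)

# contraction between two tensors: C_ik = sum_j A_ij B_jk
A = np.ones((2, 3))
B = np.ones((3, 4))
C = np.tensordot(A, B, axes=([1], [0]))
print(C.shape)              # (2, 4)
```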
On Sat, Jun 12, 2010 at 4:30 PM, Alan Bromborsky abro...@verizon.net wrote:
If I have a single numpy array, for example with 3 indices T_{ijk}, and I
want to sum over two of them in the sense of tensor contraction -
T_{k} = \sum_{i=0}^{n-1} T_{iik}. Is there an easy way to do this with
numpy?
2010/6/12 Alan Bromborsky abro...@verizon.net:
[clip]
Also you can give:
T[I, I, :]
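Spelled out, the index-array trick quoted above uses an arange over the contracted axis; summing the picked diagonal then gives the contraction (the helper names here are mine):

```python
import numpy as np

n = 3
T = np.arange(n * n * 4, dtype=float).reshape(n, n, 4)
I = np.arange(n)

# T[I, I, :] picks the diagonal elements T_{iik}; summing over the
# first axis then gives T_k = sum_i T_{iik}
Tk = T[I, I, :].sum(axis=0)
print(Tk.shape)  # (4,)

# agrees with the trace over the first two axes
assert np.allclose(Tk, np.trace(T, axis1=0, axis2=1))
```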
Sat, 12 Jun 2010 23:15:16 +0200, Friedrich Romstedt wrote:
[clip]
But note that for:
T[:, I, I]
the shape is reversed with respect to that of:
T[I, :, I] and T[I, I, :] .
I think it should be written in the docs how the shape is derived.
It's explained there in detail (although maybe not
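The shape behaviour discussed above follows from NumPy's advanced-indexing rules: when the index arrays sit next to each other, the broadcast dimension stays in place, but when a slice separates them it moves to the front of the result. A small illustration (array names are mine):

```python
import numpy as np

I = np.arange(3)

A = np.zeros((4, 3, 3))
print(A[:, I, I].shape)    # (4, 3): adjacent index arrays stay in place

B = np.zeros((3, 4, 3))
print(B[I, :, I].shape)    # (3, 4): separated index arrays go to the front

C = np.zeros((3, 3, 4))
print(C[I, I, :].shape)    # (3, 4): adjacent again, so in place
```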
Sat, 12 Jun 2010 16:30:14 -0400, Alan Bromborsky wrote:
[clip]
HTH, (not really
17 matches