Hi All, 

Now I've come across a similar problem, but a slightly different one:

Suppose we have a 3-D tensor T with dimensions KxGxH and a 3-D tensor M with 
dimensions FxHxL. What if I want to do a batched matrix multiplication on 
these two tensors?
This extends what Han Zhao asked: his 3D-2D matrix dot product is repeated 
F times, which adds an extra dimension to the resulting tensor. The 
resulting tensor will have dimensions KxFxGxL.
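
For concreteness, here is a sketch of the operation I have in mind, written 
with plain T.tensordot (I renamed the tensors to A and B to avoid clashing 
with theano.tensor as T; untested):

import theano.tensor as T

A = T.tensor3('A')  # the tensor T above, shape (K, G, H)
B = T.tensor3('B')  # the tensor M above, shape (F, H, L)

# Desired: out[k, f] = dot(A[k], B[f]), so out has shape (K, F, G, L).
# Contracting over the shared H axis gives (K, G, F, L); a dimshuffle
# then reorders the axes to (K, F, G, L).
C = T.tensordot(A, B, axes=[[2], [1]])   # (K, G, F, L)
out = C.dimshuffle(0, 2, 1, 3)           # (K, F, G, L)

But I'd prefer a single batched call if one exists, hence the question: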

Is it possible to do this within T.batched_tensordot()?

Zhouhan

On Thursday, November 6, 2014 at 4:22:10 AM UTC-5, Sander Dieleman wrote:
>
> It's built on top of that :) I just wrapped some of these operations in 
> Theano Ops so I could use them in Theano graphs. I did the same thing 
> earlier to implement the FFT convolution.
>
> Sander
>
> On Thursday, November 6, 2014 7:17:34 AM UTC+1, Wei Tan wrote:
>>
>> Sander, thanks a lot for the pointer. Is it similar to this:
>>
>> https://github.com/lebedov/scikits.cuda/blob/master/scikits/cuda/cublas.py
>>
>> This scikits package seems to have implemented a lot of the cuBLAS batch 
>> operations in its Python API.
>>
>>
>> On Wednesday, November 5, 2014 5:42:52 AM UTC-5, Sander Dieleman wrote:
>>>
>>> As mentioned before, T.batched_dot() uses scan, which is not always 
>>> ideal. If M and N are relatively small, you may benefit from using this: 
>>> https://github.com/benanne/theano_wmf/blob/master/ops.py#L126
>>> This version of batched_dot implements the same operation but relies on 
>>> cublasSgemmBatched, which is able to do the dot products in parallel 
>>> (T.batched_dot will do them in sequence because of scan). It only works on 
>>> the GPU and requires the latest version of scikits.cuda.
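>>>
>>> For reference, conceptually T.batched_dot amounts to something like this 
>>> scan (just a sketch, to show why the K dot products run in sequence):
>>>
>>> import theano
>>> import theano.tensor as T
>>>
>>> A = T.tensor3('A')  # (K, M, N)
>>> B = T.tensor3('B')  # (K, N, P)
>>>
>>> # C[i] = dot(A[i], B[i]); scan runs the K dots one after another
>>> C, _ = theano.scan(fn=lambda a, b: T.dot(a, b),
>>>                    sequences=[A, B])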
>>>
>>> Sander
>>>
>>> On Wednesday, November 5, 2014 7:27:27 AM UTC+1, Wei Tan wrote:
>>>>
>>>> Maybe T.dot() works for (KxHxH) x (HxH) = KxHxH.
>>>>
>>>> I was doing (KxMxN) x (KxNxM) = KxMxM; it seems there is no 
>>>> vectorized solution, so I had to use T.batched_dot().
>>>>
>>>> T.dot() does not give the answer I want -- it yields a 4D tensor ...
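>>>>
>>>> For example, a sketch of the shapes involved:
>>>>
>>>> import theano.tensor as T
>>>>
>>>> A = T.tensor3('A')        # (K, M, N)
>>>> B = T.tensor3('B')        # (K, N, M)
>>>>
>>>> C = T.batched_dot(A, B)   # (K, M, M): C[i] = dot(A[i], B[i])
>>>> D = T.dot(A, B)           # (K, M, K, M): contracts over N, but pairs
>>>>                           # every slice of A with every slice of B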
>>>>
>>>> Wei
>>>>
>>>> On Wednesday, November 5, 2014 12:26:59 AM UTC-5, Han Zhao wrote:
>>>>>
>>>>> Hi Fred and Wei:
>>>>>
>>>>>
>>>>> Thanks very much for the suggestions! Yes, using scan can definitely 
>>>>> do it, but what I want is a vectorized solution, as Wei pointed out. 
>>>>> Finally I found that T.dot() can simply do the trick without any 
>>>>> further modification. When T.dot() is applied to higher-order tensors, 
>>>>> it automatically generalizes to the inner product over the 
>>>>> appropriate axes. 
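>>>>>
>>>>> For my case it amounts to something like this (sketch):
>>>>>
>>>>> import theano.tensor as T
>>>>>
>>>>> A = T.tensor3('A')   # (K, H, H)
>>>>> M = T.matrix('M')    # (H, H)
>>>>>
>>>>> # dot contracts the last axis of A with the first axis of M,
>>>>> # so C[i] = dot(A[i], M) and C has shape (K, H, H)
>>>>> C = T.dot(A, M)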
>>>>>
>>>>> Again, thanks very much for the help!
>>>>>
>>>>>
>>>>>
>>>>> Han.
>>>>>
>>>>> On Wednesday, November 5, 2014 at 1:11:38 PM UTC+8, Wei Tan wrote:
>>>>>>
>>>>>> tensor.batched_dot does the trick for batched matrix multiplication, 
>>>>>> using scan:
>>>>>>
>>>>>>
>>>>>> http://deeplearning.net/software/theano/library/tensor/basic.html#tensor.batched_dot
>>>>>> tensor.batched_dot(*X*, *Y*) 
>>>>>> Parameters: 
>>>>>>    - *x* – A Tensor with sizes e.g.: for 3D (dim1, dim3, dim2)
>>>>>>    - *y* – A Tensor with sizes e.g.: for 3D (dim1, dim2, dim4)
>>>>>>
>>>>>> This function computes the dot product between the two tensors by 
>>>>>> iterating over the first dimension using scan. It returns a tensor of 
>>>>>> size, e.g. if it is 3D, (dim1, dim3, dim4). Example:
>>>>>>
>>>>>> >>> first = T.tensor3('first')
>>>>>> >>> second = T.tensor3('second')
>>>>>> >>> result = batched_dot(first, second)
>>>>>>
>>>>>> Note: 
>>>>>>
>>>>>> This is a subset of numpy.einsum, but we do not provide it for now. 
>>>>>> But numpy einsum is slower than dot or tensordot: 
>>>>>> http://mail.scipy.org/pipermail/numpy-discussion/2012-October/064259.html
>>>>>> Parameters: 
>>>>>>    - *X* (*symbolic tensor*) – left term
>>>>>>    - *Y* (*symbolic tensor*) – right term
>>>>>>
>>>>>> Returns: 
>>>>>>
>>>>>> tensor of products
>>>>>>
>>>>>>
>>>>>> On Wednesday, November 5, 2014 12:06:15 AM UTC-5, Wei Tan wrote:
>>>>>>>
>>>>>>> Hi Han, as Fred pointed out, T*M will do element-wise 
>>>>>>> multiplication, with broadcasting.
>>>>>>>
>>>>>>> If you want to do matrix multiplication, I could not find a 
>>>>>>> "vectorized" implementation.
>>>>>>>  
>>>>>>>
>>>>>>> On Tuesday, November 4, 2014 11:37:34 AM UTC-5, nouiz wrote:
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Mon, Nov 3, 2014 at 9:35 PM, Han Zhao <[email protected]> 
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>> Hi, 
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> I know it's a simple question, but I failed to find the answer I 
>>>>>>>>> want through Google, so I hope somebody can help me with this.
>>>>>>>>> Does Theano support tensor-matrix/vector multiplication? 
>>>>>>>>> **Note**: not the tensor product in the mathematical sense.
>>>>>>>>>
>>>>>>>>> For example, given a 3rd-order tensor T with dimensions KxHxH and a 
>>>>>>>>> matrix M with dimensions HxH, I would like to obtain 
>>>>>>>>> a tensor with dimensions KxHxH, where each slice of the resulting 
>>>>>>>>> tensor is the matrix product of T[i, :, :] and M.
>>>>>>>>>
>>>>>>>>
>>>>>>>> Just do T * M. This does element-wise multiplication of T[i, :, :] 
>>>>>>>> with M; this is called broadcasting.
>>>>>>>>
>>>>>>>> If you want the dot product between T[i, :, :] and M, you can use 
>>>>>>>> scan to loop over T.
>>>>>>>>
>>>>>>>> http://deeplearning.net/software/theano/tutorial/loop.html#scan
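>>>>>>>>
>>>>>>>> A sketch of both options (renaming the tensor to A to avoid clashing 
>>>>>>>> with theano.tensor as T; illustrative only):
>>>>>>>>
>>>>>>>> import theano
>>>>>>>> import theano.tensor as T
>>>>>>>>
>>>>>>>> A = T.tensor3('A')   # the tensor T above, shape (K, H, H)
>>>>>>>> M = T.matrix('M')    # (H, H)
>>>>>>>>
>>>>>>>> # Element-wise product with broadcasting: out[i] = A[i] * M
>>>>>>>> elemwise = A * M
>>>>>>>>
>>>>>>>> # Per-slice dot product via scan: out[i] = dot(A[i], M)
>>>>>>>> matmul, _ = theano.scan(fn=lambda a, m: T.dot(a, m),
>>>>>>>>                         sequences=[A], non_sequences=[M])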
>>>>>>>>
>>>>>>>>  
>>>>>>>>
>>>>>>>>> Another example: given a 3rd-order tensor T with dimensions 
>>>>>>>>> KxHxH and a vector V with dimension Hx1, I would like to obtain 
>>>>>>>>> a matrix (a degenerate 3rd-order tensor) with dimensions KxH, where 
>>>>>>>>> each row of the resulting matrix is the matrix-vector product of 
>>>>>>>>> T[i, :, :] and V.
>>>>>>>>>
>>>>>>>>
>>>>>>>> Idem, you can use scan.
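>>>>>>>>
>>>>>>>> Continuing the sketch above, with V as a flat vector (illustrative 
>>>>>>>> only):
>>>>>>>>
>>>>>>>> v = T.vector('v')    # V, shape (H,)
>>>>>>>> rows, _ = theano.scan(fn=lambda a, w: T.dot(a, w),
>>>>>>>>                       sequences=[A], non_sequences=[v])
>>>>>>>> # rows[i] = dot(A[i], v), so rows has shape (K, H)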
>>>>>>>>
>>>>>>>> Fred
>>>>>>>>
>>>>>>>
