Hi, I met the same problem.
Have you figured out how to solve it?
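In case it helps anyone else landing here: the traceback below says the prediction is a (46, 1) column matrix while the target is a 1-D vector of length 46, so the elementwise subtraction tries to broadcast a column against a row. Here is a minimal NumPy sketch of the same shape confusion and a flatten-based fix (NumPy broadcasts silently where Theano raises an error, but the cause is the same; the shapes are taken from the traceback):

```python
import numpy as np

n = 46
y_pred = np.zeros((n, 1))   # column-matrix prediction, shape (46, 1)
y = np.zeros(n)             # 1-D target vector, shape (46,)

# Broadcasting (46, 1) against (46,) pairs every prediction with every
# target, producing a (46, 46) matrix instead of 46 per-example errors.
bad = (y_pred - y) ** 2
assert bad.shape == (n, n)

# Flattening the prediction to 1-D makes the shapes line up.
good = (y_pred.flatten() - y) ** 2
assert good.shape == (n,)
```

In the Theano cost this would correspond to something like T.mean((self.y_pred.flatten() - y) ** 2), though I haven't verified that against the original model.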

On Wednesday, December 23, 2015 at 3:39:52 AM UTC+8, Hamid Reza Hassanzadeh wrote:
>
> Hello,
> Thanks,
> No I'm doing regression.
>
> On Monday, December 21, 2015 at 3:12:05 PM UTC-5, nouiz wrote:
>>
>> Hi,
>>
>> You can use the HINT from the error message to find the exact line 
>> where the problem comes from.
>>
>> MSE is used for regression, so your target should be a vector with one 
>> value per example. Then the two vectors will broadcast together correctly.
>>
>> Are you doing classification? If so, you should change your cost. If the 
>> classes are ordered, you can treat them as a regression target and use the 
>> MSE cost, but you seem to be using them as distinct, unordered classes.
>>
>> Fred
>>
>> On Sun, Dec 20, 2015 at 8:41 PM, Hamid Reza Hassanzadeh <
>> [email protected]> wrote:
>>
>>> Hi everyone,
>>> I'm trying to find the weights of a neural network with a mean squared 
>>> error cost using gradient descent. Here is the Theano function I create:
>>> train_fn = theano.function(
>>>             inputs=[train_index],
>>>             outputs=self.finetune_cost,
>>>             updates=updates,
>>>             givens={
>>>                 self.x: dataset_x[train_index],
>>>                 self.y: dataset_y[train_index]
>>>             }
>>>         )
>>>
>>> with self.x = T.matrix('x') and self.y = T.dvector('y'), and 
>>> finetune_cost is defined as:
>>> self.finetune_cost = self.regLayer.mean_sq_error(self.y)
>>>
>>> def mean_sq_error(self, y):
>>>     return T.mean((self.y_pred - y) ** 2)
>>>
>>> Now the problem is that when I compile I get the following error:
>>> ValueError: Input dimension mis-match. (input[0].shape[1] = 1, 
>>> input[2].shape[1] = 46)
>>> Apply node that caused the error: Elemwise{Composite{((i0 + i1) - 
>>> i2)}}[(0, 0)](Dot22.0, InplaceDimShuffle{x,0}.0, InplaceDimShuffle{x,0}.0)
>>> Toposort index: 30
>>> Inputs types: [TensorType(float64, matrix), TensorType(float64, row), 
>>> TensorType(float64, row)]
>>> Inputs shapes: [(46L, 1L), (1L, 1L), (1L, 46L)]
>>> Inputs strides: [(8L, 8L), (8L, 8L), (368L, 8L)]
>>> Inputs values: ['not shown', array([[ 0.]]), 'not shown']
>>> Outputs clients: [[Elemwise{sqr,no_inplace}(Elemwise{Composite{((i0 + 
>>> i1) - i2)}}[(0, 0)].0), Elemwise{Composite{((i0 * i1) / i2)}}[(0, 
>>> 1)](TensorConstant{(1L, 1L) of 2.0}, Elemwise{Composite{((i0 + i1) - 
>>> i2)}}[(0, 0)].0, Elemwise{mul,no_inplace}.0)]]
>>>
>>> HINT: Re-running with most Theano optimization disabled could give you a 
>>> back-trace of when this node was created. This can be done with by setting 
>>> the Theano flag 'optimizer=fast_compile'. If that does not work, Theano 
>>> optimizations can be disabled with 'optimizer=None'.
>>> HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint 
>>> and storage map footprint of this apply node.
>>>
>>>
>>> I guess the problem is that self.y_pred is a column matrix whereas y is 
>>> a dvector, which is broadcast as a row matrix. What should I do?
>>>
>>>
>>>
>>> -- 
>>>
>>> --- 
>>> You received this message because you are subscribed to the Google 
>>> Groups "theano-users" group.
>>> To unsubscribe from this group and stop receiving emails from it, send 
>>> an email to [email protected].
>>> For more options, visit https://groups.google.com/d/optout.
>>>
>>
>>
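Fred's broadcasting point can also be satisfied from the target side: instead of flattening the prediction, lift the 1-D target to a column so both operands are (N, 1) (in Theano that would be something like y.dimshuffle(0, 'x'); the NumPy sketch below only illustrates the shapes):

```python
import numpy as np

n = 46
y_pred = np.zeros((n, 1))   # column-matrix prediction, shape (46, 1)
y = np.zeros(n)             # 1-D target vector, shape (46,)

# Reshape the target to a (46, 1) column so the elementwise subtraction
# stays per-example instead of broadcasting out to (46, 46).
err = (y_pred - y.reshape(-1, 1)) ** 2
assert err.shape == (n, 1)
mse = err.mean()
```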
