Hi, did you solve this? 

Regards.

On Friday, 17 July 2015 at 13:27:06 (UTC+2), Abhishek Shivkumar wrote:
>
> I have the following as well and wanted to know if the method below is 
> the right way to do it:
>
> self.output = theano.ifelse.ifelse(T.eq(is_train, 0), output_dropped, out * (1.0 - p_drop))
>
>
> On Friday, July 17, 2015 at 12:03:35 PM UTC+1, Abhishek Shivkumar wrote:
>>
>> Hi Daniel
>>
>>  I think I still have a problem with this in my code.
>>
>> I am using it as follows:
>>
>> self.output = T.switch(T.eq(is_train, 0), self._dropout_from_layer(rng, my_layer=out, p_drop=p_drop), out * (1.0 - p_drop))
>>
>> Is this wrong?
>>
>>
>> On Friday, July 17, 2015 at 10:53:24 AM UTC+1, Abhishek Shivkumar wrote:
>>>
>>> Thanks Daniel. That was exactly the mistake I was making.
>>>
>>> Thanks for the resolution. It now works fine. 
>>>
>>> Abhishek S
>>>
>>> On Friday, July 17, 2015 at 9:07:25 AM UTC+1, Daniel Renshaw wrote:
>>>>
>>>> When you say an if statement, do you mean a Python if statement?
>>>>
>>>> if T.neq(...):
>>>>
>>>> won't work because T.neq(...) is a symbolic expression and doesn't 
>>>> evaluate to true or false as Python requires. Theano does have a symbolic 
>>>> if statement though:
>>>>
>>>> theano.ifelse.ifelse(T.neq(...).all(), A, B)
>>>>
>>>> should work fine.
>>>>
>>>> Note though that ifelse takes a boolean condition while T.switch takes 
>>>> a tensor and evaluates it elementwise.
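>>>> As a rough NumPy analogy (illustrative only; the arrays here are made 
>>>> up): numpy.where selects elementwise like T.switch, while a plain 
>>>> Python conditional picks one whole branch, like ifelse:

```python
import numpy as np

cond = np.array([True, False, True])
a = np.array([1, 2, 3])
b = np.array([10, 20, 30])

# Elementwise selection, like T.switch: picks a[i] where cond[i] is True,
# b[i] where it is False.
elementwise = np.where(cond, a, b)   # array([ 1, 20,  3])

# Whole-branch selection, like ifelse: a single boolean chooses a or b.
branch = a if cond.all() else b      # cond.all() is False, so branch is b
```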
>>>>
>>>> Daniel
>>>>
>>>>
>>>> On 16 July 2015 at 14:48, Abhishek Shivkumar <[email protected]> 
>>>> wrote:
>>>>
>>>>> I just wanted to inform that I see it kind of resolved.
>>>>>
>>>>> The thing is that I was using T.neq( ) in an if statement. It looks 
>>>>> like it cannot be used in an if statement and should be used only as 
>>>>> part of a T.switch statement.
>>>>>
>>>>> Thanks
>>>>> Abhishek S
>>>>>
>>>>>
>>>>> On Thursday, July 16, 2015 at 1:59:28 PM UTC+1, Abhishek Shivkumar 
>>>>> wrote:
>>>>>>
>>>>>> Thanks for the reply.
>>>>>>
>>>>>> I am using the following method to define the train method
>>>>>>
>>>>>>     
>>>>>>
>>>>>> train_model = theano.function(
>>>>>>     on_unused_input='ignore',
>>>>>>     inputs=[index, l],
>>>>>>     outputs=cost,
>>>>>>     updates=updates,
>>>>>>     givens={
>>>>>>         x: train_set_x[index * batch_size: (index + 1) * batch_size],
>>>>>>         y: train_set_y[index * batch_size: (index + 1) * batch_size, :],
>>>>>>         lr: l,
>>>>>>         is_train: numpy.cast['int32'](0)
>>>>>>     }
>>>>>> )
>>>>>>
>>>>>>
>>>>>> As you see above, I am passing is_train to denote it is training.
>>>>>>
>>>>>>
>>>>>> also, I have a validate method as follows:
>>>>>>
>>>>>>
>>>>>> predict_valid = theano.function(
>>>>>>     inputs=[index],
>>>>>>     outputs=layer3.y_pred,
>>>>>>     on_unused_input='ignore',
>>>>>>     givens={
>>>>>>         x: valid_set_x[index * batch_size: (index + 1) * batch_size],
>>>>>>         is_train: numpy.cast['int32'](1)
>>>>>>     }
>>>>>> )
>>>>>>
>>>>>> As you see, the same is_train variable is passed, but here it is 1 
>>>>>> to denote validation. is_train is defined as follows:
>>>>>>
>>>>>> is_train = T.iscalar('is_train') 
>>>>>>
>>>>>> In the hidden layer, I am using this variable as follows:
>>>>>>
>>>>>> if T.eq(is_train, 0):  # Training
>>>>>>     self.output = T.tanh(T.dot(input, self.W) + self.b)
>>>>>>     self.output = self._dropout_from_layer(rng, my_layer=self.output,
>>>>>>                                            p_drop=p_drop)
>>>>>> else:
>>>>>>     self.output = T.tanh(T.dot(input, self.W) + self.b)
>>>>>>     self.output = self.output * (1.0 - p_drop)
>>>>>>
>>>>>> As you see above, I am dropping units if it is training and scaling 
>>>>>> down the output if it is validation.
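>>>>>> The intended train-time drop / test-time scaling scheme can be 
>>>>>> sketched in plain NumPy (names, shapes, and the drop rate here are 
>>>>>> made up for illustration):

```python
import numpy as np

rng = np.random.RandomState(0)
p_drop = 0.5
out = np.ones((4, 3))  # stand-in for the layer's activations

# Training: zero each unit independently with probability p_drop.
mask = rng.binomial(n=1, p=1.0 - p_drop, size=out.shape)
train_output = out * mask

# Validation: keep every unit, scaled by the expected keep rate.
valid_output = out * (1.0 - p_drop)
```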
>>>>>>
>>>>>> but, when I run my code, I get this error:
>>>>>>
>>>>>> raise UnusedInputError(msg % (inputs.index(i), i.variable, err_msg))
>>>>>> theano.compile.function_module.UnusedInputError: theano.function was 
>>>>>> asked to create a function computing outputs given certain inputs, 
>>>>>> but the provided input variable at index 2 is not part of the 
>>>>>> computational graph needed to compute the outputs: 
>>>>>> <TensorType(int32, scalar)>.
>>>>>> To make this error into a warning, you can pass the parameter 
>>>>>> on_unused_input='warn' to theano.function. To disable it completely, 
>>>>>> use on_unused_input='ignore'.
>>>>>>
>>>>>> It says I am using an input that is not part of the graph, but I am 
>>>>>> using that variable as the condition that determines the output 
>>>>>> above, right? So can you please help me resolve this?
>>>>>>
>>>>>> Thanks again!
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Thursday, July 16, 2015 at 12:49:20 PM UTC+1, Daniel Renshaw wrote:
>>>>>>>
>>>>>>> Are you using, and modifying, a standard piece of code? If so, can 
>>>>>>> you please provide a reference so we can see the context of your 
>>>>>>> question? 
>>>>>>> If not, you'll need to share more of the code so we can see what's 
>>>>>>> going on.
>>>>>>>
>>>>>>> Daniel
>>>>>>>
>>>>>>>
>>>>>>> On 16 July 2015 at 12:37, Abhishek Shivkumar <[email protected]> 
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Hi,
>>>>>>>>
>>>>>>>>
>>>>>>>>   I want to be able to switch the dropout variable to True and 
>>>>>>>> False during training and validation respectively. As of now, the 
>>>>>>>> way I am doing it is defining a variable
>>>>>>>>
>>>>>>>> self.phase = 0
>>>>>>>>
>>>>>>>> inside a call, and then just before I call the validate( ) method 
>>>>>>>> I set this variable to 1. I am not sure this is the right way to 
>>>>>>>> do it, as it doesn't seem to have any effect on the results when I 
>>>>>>>> don't switch the dropout during validation at all. So I think 
>>>>>>>> there should be a better way to do this.
>>>>>>>>
>>>>>>>>
>>>>>>>> Can someone please recommend the right way to set this variable 
>>>>>>>> just before I call the train( ) method and just before I call the 
>>>>>>>> validate( ) method?
>>>>>>>>
>>>>>>>>
>>>>>>>> Thanks
>>>>>>>>
>>>>>>>> Abhishek S
>>>>>>>>
>>>>>>>> -- 
>>>>>>>>
>>>>>>>> --- 
>>>>>>>> You received this message because you are subscribed to the Google 
>>>>>>>> Groups "theano-users" group.
>>>>>>>> To unsubscribe from this group and stop receiving emails from it, 
>>>>>>>> send an email to [email protected].
>>>>>>>> For more options, visit https://groups.google.com/d/optout.
>>>>>>>>
>>>>>>>
>>>>
>>>>
