Hi Patric, 
Yes, I tried to define a new op and it works, but I don't know why the 
speed becomes much slower....

import numpy as np
import theano

class round_2(theano.Op):
    __props__ = ()

    def make_node(self, x):
        x = theano.tensor.as_tensor_variable(x)
        # Note: the output has the same type (dtype and broadcastable
        # pattern) as the input.
        return theano.Apply(self, [x], [x.type()])

    def perform(self, node, inputs, output_storage):
        x = inputs[0]
        z = output_storage[0]
        z[0] = np.round(x)

    def infer_shape(self, node, i0_shapes):
        return i0_shapes

    def grad(self, inputs, output_grads):
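        # Override the gradient of round with identity: pass the incoming
        # gradient through unchanged (a straight-through estimator).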
        return [output_grads[0]]

    def R_op(self, inputs, eval_points):
        # R_op can receive None as eval_points.
        # That means there is no differentiable path through that input.
        # If this implies that you cannot compute some outputs,
        # return None for those.
        if eval_points[0] is None:
            return eval_points
        return self.grad(inputs, eval_points) 
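
For reference, here is a minimal sketch of how the op could be wired up to 
mirror the TensorFlow quantize function (the round_op and quantize names 
here are just illustrative):

import theano
import theano.tensor as T

round_op = round_2()

def quantize(x, k):
    # Forward pass rounds; backward pass behaves like identity
    # because of the grad() defined above.
    n = float(2 ** k - 1)
    return round_op(x * n) / n

x = T.matrix('x')
y = quantize(x, 4)
g = theano.grad(y.sum(), x)
f = theano.function([x], [y, g])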

On Monday, 24 April 2017 13:48:37 UTC+9:30, Patric wrote:
>
> Does the below link help?
>
>
> http://stackoverflow.com/questions/40613225/theano-how-to-override-gradient-for-part-of-op-graph
>
>
>
> On Sunday, April 23, 2017 at 3:27:49 PM UTC+8, Bohan Zhuang wrote:
>>
>> Recently, I found a piece of code written in TensorFlow. It serves to 
>> override tf.round()'s gradient with tf.identity()'s using the 
>> gradient_override_map function:
>>
>> import tensorflow as tf
>> G = tf.get_default_graph()
>> def quantize(x, k):
>>     n = float(2**k - 1)
>>     with G.gradient_override_map({"Round": "Identity"}):
>>         return tf.round(x * n) / n
>>
>> And I want to implement it in Theano but have no idea how... So could 
>> anyone provide some advice on that? 
>>
>
