Here's the definition of the grad() function inside my Op class.

    def grad(self, inputs, grads):
        x, = inputs
        gz, = grads
        # rval = theano.tensor.as_tensor_variable(
        #     gz * (1 - numpy.power(theano.tensor.tanh(x), 2)))  # Doesn't work
        # rval = gz * math.tanh(float(x))   # Doesn't work
        # rval = 2 * numpy.power(x, 2)    # Works
        rval = gz * theano.tensor.tanh(x)   # Doesn't work
        return [rval]
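
For what it's worth, here is a sketch of the same gradient built purely from theano.scalar ops. It rests on an assumption: that the op behind this grad() is a theano.scalar.ScalarOp that Elemwise wraps (the elemwise.py frames in the traceback below point that way). Under that assumption the gradient graph has to stay in scalar ops, so theano.tensor.tanh, math.tanh, and numpy calls on the symbolic x are all ruled out. The class name below is made up for illustration:

    import theano.scalar

    class Binarize(theano.scalar.ScalarOp):   # hypothetical name, for illustration only
        # ... make_node / perform as before ...

        def grad(self, inputs, grads):
            x, = inputs
            gz, = grads
            # d/dx tanh(x) = 1 - tanh(x)**2, built only from theano.scalar ops
            dtanh = theano.scalar.sub(1, theano.scalar.sqr(theano.scalar.tanh(x)))
            return [gz * dtanh]

If the op is in fact a full tensor-level Op rather than a ScalarOp under Elemwise, I'd expect the first commented variant with theano.tensor.tanh to be the right form, so the scalar route above is only a guess.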


And here's the full traceback:

Traceback (most recent call last):
  File "<ipython-input-23-cb484da32904>", line 1, in <module>
    runfile('/Users/mrins/anaconda/lib/python3.4/site-packages/theano/tensor/nnet/step.py', wdir='/Users/mrins/anaconda/lib/python3.4/site-packages/theano/tensor/nnet')
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/spyderlib/widgets/externalshell/sitecustomize.py", line 699, in runfile
    execfile(filename, namespace)
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/spyderlib/widgets/externalshell/sitecustomize.py", line 88, in execfile
    exec(compile(open(filename, 'rb').read(), filename, 'exec'), namespace)
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/tensor/nnet/step.py", line 168, in <module>
    test_VectorBinarization()
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/tensor/nnet/step.py", line 163, in test_VectorBinarization
    test_vector_binarize.test_grad()
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/tensor/nnet/step.py", line 149, in test_grad
    verify_grad(binarize, [numpy.random.rand(5, 7, 2)])
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/tests/unittest_tools.py", line 91, in verify_grad
    T.verify_grad(op, pt, n_tests, rng, *args, **kwargs)
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/gradient.py", line 1695, in verify_grad
    disconnected_inputs='ignore')
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/gradient.py", line 553, in grad
    grad_dict, wrt, cost_name)
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/gradient.py", line 1317, in _populate_grad_dict
    rval = [access_grad_cache(elem) for elem in wrt]
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/gradient.py", line 1317, in <listcomp>
    rval = [access_grad_cache(elem) for elem in wrt]
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/gradient.py", line 1272, in access_grad_cache
    term = access_term_cache(node)[idx]
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/gradient.py", line 1106, in access_term_cache
    new_output_grads)
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/gof/op.py", line 700, in L_op
    return self.grad(inputs, output_grads)
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/tensor/elemwise.py", line 671, in grad
    rval = self._bgrad(inputs, ograds)
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/tensor/elemwise.py", line 786, in _bgrad
    ret.append(transform(scalar_igrad))
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/tensor/elemwise.py", line 778, in transform
    *[transform(ipt) for ipt in node.inputs])
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/tensor/elemwise.py", line 778, in <listcomp>
    *[transform(ipt) for ipt in node.inputs])
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/tensor/elemwise.py", line 778, in transform
    *[transform(ipt) for ipt in node.inputs])
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/gof/op.py", line 604, in __call__
    node = self.make_node(*inputs, **kwargs)
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/tensor/elemwise.py", line 586, in make_node
    DimShuffle, *inputs)
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/tensor/elemwise.py", line 528, in get_output_info
    for i in inputs])
  File "/Users/mrins/anaconda/lib/python3.4/site-packages/theano/tensor/basic.py", line 1200, in make_node
    assert isinstance(t.type, TensorType)
AssertionError
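
For reference, the failing check is the standard verify_grad test; this is reconstructed from the traceback above rather than copied from my file, and `binarize` stands for the custom op under test:

    import numpy
    from theano.tests import unittest_tools as utt

    def test_grad(binarize):
        # verify_grad compares the op's symbolic grad() against a numeric
        # finite-difference estimate on a random input.
        utt.verify_grad(binarize, [numpy.random.rand(5, 7, 2)])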


Regards,
Mrinmoy

On Wed, Oct 26, 2016 at 6:50 PM, Frédéric Bastien <[email protected]> wrote:

> Can you give the full stack trace and your grad() method? There is
> something strange. The Theano variable X seems malformed.
>
> The problem could also be in your make_node building a bad output
> variable?
>
> Fred
>
> On Wed, Oct 26, 2016 at 5:47 PM, mrinmoy maity <[email protected]> wrote:
>
>>
>> I am trying to experiment with a new Op in Theano. While defining the grad()
>> method, a function f(theano.tensor.tanh(x)) is used, where x is the input.
>> However, internally it's hitting an assert here:
>>
>>
>>   File "~/anaconda/lib/python3.4/site-packages/theano/tensor/basic.py",
>> line 1198, in make_node
>>     assert isinstance(t.type, TensorType)
>>
>>
>> The partial traceback is given here:
>>
>>
>>   File "~/anaconda/lib/python3.4/site-packages/theano/gof/op.py", line 604, in __call__
>>     node = self.make_node(*inputs, **kwargs)
>>
>>   File "~/anaconda/lib/python3.4/site-packages/theano/tensor/elemwise.py", line 586, in make_node
>>     DimShuffle, *inputs)
>>
>>   File "~/anaconda/lib/python3.4/site-packages/theano/tensor/elemwise.py", line 528, in get_output_info
>>     for i in inputs])
>>
>>   File "~/anaconda/lib/python3.4/site-packages/theano/tensor/basic.py", line 1198, in make_node
>>     assert isinstance(t.type, TensorType)
>>
>> AssertionError
>>
>>
>> The t.type here is 'float64' instead of a TensorType. The issue is easily
>> reproducible by using tanh inside grad().
>> Note that I'm not defining a 'tanh' Op here; I'm only using tanh inside
>> grad(). Also, wrapping the return value of grad() in
>> theano.tensor.as_tensor_variable() doesn't help.
>>
>>
>> Please let me know if there's a workaround for this.
>>
>>
>> Regards,
>> Mrinmoy
