Theano does not support the gradient for complex numbers. It is complicated to implement and we have not needed it, so it was never done.

If you want to implement it, there is a proposal describing how to do it. There are also workarounds for some cases: if you represent each complex number as an extra dimension of shape 2 and use only operators on float numbers, you get the first-order gradient. Some people have done that; see the sketch below.
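A minimal sketch of that encoding, assuming a recent Theano (0.8+); the helper cmul and the example cost are mine, not a Theano API:

    import numpy as np
    import theano
    import theano.tensor as T

    # Encode an (n, m) complex matrix as an (n, m, 2) float tensor:
    # [:, :, 0] holds the real parts, [:, :, 1] the imaginary parts.

    def cmul(a, b):
        """Elementwise complex multiply on (n, m, 2) float tensors."""
        re = a[:, :, 0] * b[:, :, 0] - a[:, :, 1] * b[:, :, 1]
        im = a[:, :, 0] * b[:, :, 1] + a[:, :, 1] * b[:, :, 0]
        return T.stack([re, im], axis=2)

    x = T.tensor3('x')
    w = T.tensor3('w')
    z = cmul(x, w)
    loss = T.sqr(z).sum()   # sum of squared components, a real-valued cost
    grad = T.grad(loss, w)  # works: the graph contains only float ops

    f = theano.function([x, w], grad)
    xv = np.random.rand(3, 4, 2).astype(theano.config.floatX)
    wv = np.random.rand(3, 4, 2).astype(theano.config.floatX)
    print(f(xv, wv))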
Fred

On Jul 26, 2016 at 23:42, <[email protected]> wrote:

> I have two [X,] matrices where X is some number, and I'd like to make a
> loss function that will obtain their Pearson correlation.
>
> Code:
>
>     def corr(a, b):
>         c = np.cov(a, b)
>         d = np.diag(c)
>         stddev = np.sqrt(d.real)
>         c /= stddev[:, None]
>         c /= stddev[None, :]
>
>         return c
>
>     x = T.matrix()  # some input
>     y = T.matrix()  # one of the two matrices
>
>     loss = lasagne.layers.get_output([outputlayer], 1)  # output from simple mlp
>     loss = corr(loss, y)[0][1]
>     loss = loss.mean()
>
>     nnparameters = lasagne.layers.get_all_params([outputlayer],
>                                                  trainable=True)  # parameters from simple mlp
>
>     grads = T.grad(loss, nnparameters)
>
> I get this error when I try to obtain grads:
>
>     Traceback (most recent call last):
>       File "<ipython-input-232-36dcc0f0d1d5>", line 1, in <module>
>         grads = T.grad(loss, nnparameters)
>       File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 549, in grad
>         grad_dict, wrt, cost_name)
>       File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1312, in _populate_grad_dict
>         rval = [access_grad_cache(elem) for elem in wrt]
>       File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1267, in access_grad_cache
>         term = access_term_cache(node)[idx]
>       File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 961, in access_term_cache
>         output_grads = [access_grad_cache(var) for var in node.outputs]
>       File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1267, in access_grad_cache
>         term = access_term_cache(node)[idx]
>       File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 961, in access_term_cache
>         output_grads = [access_grad_cache(var) for var in node.outputs]
>       File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1267, in access_grad_cache
>         term = access_term_cache(node)[idx]
>       File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 961, in access_term_cache
>         output_grads = [access_grad_cache(var) for var in node.outputs]
>       File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1267, in access_grad_cache
>         term = access_term_cache(node)[idx]
>       File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 961, in access_term_cache
>         output_grads = [access_grad_cache(var) for var in node.outputs]
>       File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1267, in access_grad_cache
>         term = access_term_cache(node)[idx]
>       File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1101, in access_term_cache
>         input_grads = node.op.grad(inputs, new_output_grads)
>       File "/usr/local/lib/python2.7/dist-packages/theano/tensor/elemwise.py", line 698, in grad
>         rval = self._bgrad(inputs, ograds)
>       File "/usr/local/lib/python2.7/dist-packages/theano/tensor/elemwise.py", line 773, in _bgrad
>         scalar_igrads = self.scalar_op.grad(scalar_inputs, scalar_ograds)
>       File "/usr/local/lib/python2.7/dist-packages/theano/scalar/basic.py", line 908, in grad
>         self.__class__.__name__)
>
>     MethodNotDefined: ('grad', <class 'theano.scalar.basic.Conj'>, 'Conj')
>
> Would anyone know how to solve this error?
>
> Thanks
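For the code in the question, the Conj node comes from np.cov, which conjugates its arguments, so the fix is to avoid calling np.cov on symbolic variables. A sketch of the Pearson correlation written with float Theano ops only; the stand-in MLP (shapes, layer names) is mine, not from the question:

    import lasagne
    import theano.tensor as T

    x = T.matrix('x')
    y = T.matrix('y')  # targets, same shape as the network output

    # A small MLP standing in for the network from the question.
    l_in = lasagne.layers.InputLayer((None, 10), input_var=x)
    outputlayer = lasagne.layers.DenseLayer(l_in, num_units=10)

    def pearson_corr(a, b):
        """Pearson correlation using only float Theano ops (no np.cov, no Conj)."""
        am = a.flatten() - a.mean()
        bm = b.flatten() - b.mean()
        return (am * bm).mean() / (am.std() * bm.std())

    out = lasagne.layers.get_output(outputlayer)
    loss = -pearson_corr(out, y)  # minimize the negative to maximize correlation
    nnparameters = lasagne.layers.get_all_params(outputlayer, trainable=True)
    grads = T.grad(loss, nnparameters)  # no complex ops anywhere, so this works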
