Hello @nouiz, I have read many of your posts, but I still could not map my problem to them. Please help me.
I am getting this error:

TypeError: ('An update must have the same type as the original shared variable (shared_var=1lstm1_U_rgrad2, shared_var.type=TensorType(float32, matrix), update_val=Elemwise{add,no_inplace}.0, update_val.type=TensorType(float64, matrix)).', 'If the difference is related to the broadcast pattern, you can call the tensor.unbroadcast(var, axis_to_unbroadcast[, ...]) function to remove broadcastable dimensions.')

The error appears on this line of code:

    f_grad_shared = theano.function([emb11, mask11, emb21, mask21, y], cost,
                                    updates=zgup + rg2up,
                                    name='adadelta_f_grad_shared',
                                    allow_input_downcast=True)

The parameters passed to this function are created as follows:

    y = tensor.vector('y', dtype='float32')
    mask11 = tensor.matrix('mask11', dtype='float32')
    mask21 = tensor.matrix('mask21', dtype='float32')
    emb11 = theano.tensor.ftensor3('emb11')
    emb21 = theano.tensor.ftensor3('emb21')
    trng = RandomStreams(1234)
    self.tnewp = init_tparams(newp)
    rate = 0.5
    rrng = trng.binomial(emb11.shape, p=1 - rate, n=1, dtype=emb11.dtype)
    proj11 = getpl2(emb11, '1lstm1', mask11, False, rrng, 50, self.tnewp)[-1]
    proj21 = getpl2(emb21, '2lstm1', mask21, False, rrng, 50, self.tnewp)[-1]
    dif = (proj21 - proj11).norm(L=1, axis=1)
    s2 = T.exp(-dif)
    sim = T.clip(s2, 1e-7, 1.0 - 1e-7)
    lr = tensor.scalar(name='lr', dtype='float32')
    ys = T.clip((y - 1.0) / 4.0, 1e-7, 1.0 - 1e-7)
    cost = T.mean((sim - ys) ** 2)

I checked the data types: cost.type() is <TensorType(float64, scalar)>, whereas all the other variables are float32. Please tell me how I can deal with this issue. It is urgent.

Regards,
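P.S. To illustrate what I suspect is happening, here is a minimal NumPy sketch (the array values are only illustrative; my real code is Theano). My guess is that the Python float literals 1.0 and 4.0 in the ys expression enter the graph as float64 constants when floatX is left at its default of 'float64', which upcasts cost to float64. Would casting the final expression with T.cast(cost, 'float32'), or setting floatX=float32 in my .theanorc, be the right fix?

```python
import numpy as np

# Sketch of the mechanism I suspect: once one operand is float64,
# the whole expression is promoted to float64, even though the
# original data is float32.
y32 = np.asarray([1.0, 3.0, 5.0], dtype=np.float32)
ys_bad = (y32.astype(np.float64) - 1.0) / 4.0   # promoted to float64
print(ys_bad.dtype)   # float64 -- the dtype the TypeError complains about

# A fix I could try: cast the final result back to float32,
# mirroring T.cast(cost, 'float32') in the Theano graph.
cost = np.mean((ys_bad - 0.5) ** 2).astype(np.float32)
print(cost.dtype)     # float32 -- matches the float32 shared variables again
```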