Hi Frederic,

Thanks a lot for your response. It now produces results similar to the case where the softmax is computed outside the step function. When I tried to post some results here, I realized that I had made a silly mistake in the evaluation part. Sorry for spamming, and thanks again.
Cheers,
Mohammad.

On Wednesday, November 9, 2016 at 12:09:12 AM UTC+10:30, nouiz wrote:
>
> If you want a chance to get more help, tell us what you expected and what
> you got.
>
> Fred
>
> On 8 Nov 2016 05:53, "Mohammad Najafi" <[email protected]> wrote:
>
>> Hi,
>>
>> I want the softmax of the RNN output to be computed within the step
>> function, like below:
>>
>> def step(x_t, h_tm1):
>>     h_t = self.activation(T.dot(x_t, self.W_in) +
>>                           T.dot(h_tm1, self.W) + self.bh)
>>     y_t = T.dot(h_t, self.W_out) + self.by
>>     y_t = T.nnet.softmax(y_t)
>>     return h_t, y_t
>>
>> However, I do not get the expected results, and I am not sure where the
>> problem is. Can anybody give me a hint why this method does not work?
>>
>> This is the scan call:
>>
>> [self.h, self.y_pred], _ = theano.scan(
>>     step,
>>     sequences=self.input,
>>     outputs_info=[T.alloc(self.h0, self.input.shape[1], n_hidden),
>>                   None])
>>
>> and I pass self.y_pred directly to p_y_given_x:
>>
>> self.p_y_given_x = self.y_pred
>>
>> Thanks in advance.
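For anyone finding this thread later: the per-step softmax works because `T.nnet.softmax` applied to a 2D `(batch, n_out)` matrix normalizes each row independently, so computing it inside `step` is equivalent to applying it to the stacked outputs afterwards. Below is a minimal NumPy sketch (not Theano) of the same recurrence pattern; all shapes, parameter names, and the `tanh` activation are illustrative assumptions, not the original poster's code.

```python
import numpy as np

def softmax(z):
    # Row-wise softmax over a 2D array (batch, n_out),
    # mirroring T.nnet.softmax's behaviour on 2D input.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def step(x_t, h_tm1, W_in, W, W_out, bh, by):
    # One recurrence step: hidden-state update, then softmax
    # computed inside the step (as in the thread's question).
    h_t = np.tanh(x_t @ W_in + h_tm1 @ W + bh)
    y_t = softmax(h_t @ W_out + by)
    return h_t, y_t

rng = np.random.RandomState(0)
n_steps, batch, n_in, n_hidden, n_out = 6, 2, 3, 4, 5
W_in  = rng.randn(n_in, n_hidden)
W     = rng.randn(n_hidden, n_hidden)
W_out = rng.randn(n_hidden, n_out)
bh, by = np.zeros(n_hidden), np.zeros(n_out)

x = rng.randn(n_steps, batch, n_in)   # sequence-major input, as in scan
h = np.zeros((batch, n_hidden))       # initial hidden state (h0)
ys = []
for t in range(n_steps):              # the loop scan would perform
    h, y = step(x[t], h, W_in, W, W_out, bh, by)
    ys.append(y)
y_pred = np.stack(ys)                 # shape (n_steps, batch, n_out)

# Each timestep's output row is a valid probability distribution.
assert np.allclose(y_pred.sum(axis=2), 1.0)
```

Since every row already sums to one, `y_pred` can be used directly as `p_y_given_x`, exactly as in the snippet above.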
