Thanks. So the output of my network has the following shape: (:, #Classes, 
Image_Width, Image_Height). Fortunately the categorical cross entropy in 
Lasagne supports multiple classes. So in my loss function:

loss = lasagne.objectives.categorical_crossentropy(prediction, target)


- prediction has shape (batch_size, 7, 512, 512), where #Classes = 7, 
Image_Width = 512, Image_Height = 512
- target has the same shape
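As Robb mentions below, the cost function usually expects a 2D (pixels x classes) array, so the (batch, classes, H, W) conv output has to be shuffled and reshaped first. Here is a NumPy sketch of that shape gymnastics (the Theano equivalent would use dimshuffle/reshape); the sizes are toy values, not the actual 7 x 512 x 512 network:

```python
import numpy as np

# Toy sizes standing in for (batch_size, 7, 512, 512)
batch_size, n_classes, h, w = 2, 7, 4, 4

rng = np.random.default_rng(0)
logits = rng.normal(size=(batch_size, n_classes, h, w))

# Move the class axis last, then flatten every pixel into a row:
# (batch, classes, H, W) -> (batch, H, W, classes) -> (batch*H*W, classes)
flat = np.transpose(logits, (0, 2, 3, 1)).reshape(-1, n_classes)

# Softmax over the class axis so each pixel's row sums to 1,
# which is the layout a 2D categorical cross entropy expects
e = np.exp(flat - flat.max(axis=1, keepdims=True))
probs = e / e.sum(axis=1, keepdims=True)
```

The target would be flattened the same way so the two line up row by row.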

Now the "issue" is that the prediction has values between 0 and 1 for each 
class (i.e. prediction[0][0] ... prediction[0][6]) and is not binary. That's 
expected, since there is a softmax. Applying argmax across the class axis 
(i.e. axis 1):

np.argmax(prediction,axis=1)

gives a single image containing, for each pixel, the index of the most 
probable class, not the values themselves. That's not what I need: basically 
one wants a binary image per class.
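That label image can be expanded straight back into one binary image per class by one-hot encoding it along the class axis, with no thresholding involved. A NumPy sketch with toy sizes (not the actual network output):

```python
import numpy as np

n_classes, h, w = 7, 4, 4
rng = np.random.default_rng(1)
pred = rng.random(size=(1, n_classes, h, w))  # stand-in softmax output

# Hard assignment: each pixel gets the index of its most probable class
labels = np.argmax(pred, axis=1)              # shape (1, H, W)

# One-hot along a new class axis: one binary mask per class
masks = (labels[:, None, :, :] == np.arange(n_classes)[None, :, None, None])
masks = masks.astype(np.uint8)                # shape (1, n_classes, H, W)
```

Every pixel belongs to exactly one mask, so the masks sum to 1 across the class axis.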

So I am wondering if thresholding each class map is the way to go. I was 
playing with the mean value as a threshold (i.e. image > mean). It gives me 
some meaningful results, but I am not sure that's totally correct.
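One caveat with per-class thresholding of softmax outputs, shown on a toy 3-class array (made-up numbers, not real predictions): a fixed threshold can leave some pixels with no class at all, whereas argmax always assigns exactly one class per pixel.

```python
import numpy as np

# Toy softmax output: axis 0 is the class axis, probabilities sum to 1 per pixel
probs = np.array([
    [[0.60, 0.20], [0.34, 0.10]],
    [[0.30, 0.70], [0.33, 0.80]],
    [[0.10, 0.10], [0.33, 0.10]],
])

# Fixed threshold: at most one class per pixel can exceed 0.5, but a pixel
# may end up unassigned (e.g. the 0.34 / 0.33 / 0.33 pixel below)
thr = (probs > 0.5).astype(np.uint8)

# argmax-style hard assignment: exactly one class per pixel
hard = (probs == probs.max(axis=0, keepdims=True)).astype(np.uint8)
```

A data-dependent threshold like the per-class mean has the opposite risk: a pixel can end up in several masks at once, so the masks are no longer a partition of the image.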

-Alex

On Saturday, July 30, 2016 at 6:36:47 PM UTC-7, Robb Brown wrote:
>
> That's the one. You have to run the output of your convolution through 
> something like that function to implement the softmax.
>
> The cost function also needs a bit of special treatment when your final 
> layer is convolutional. Basically you need to reshape and shuffle 
> dimensions so that everything lines up with what your cost function expects.
>

-- 

--- 
You received this message because you are subscribed to the Google Groups 
"theano-users" group.