Hi Robb,

In your code you have:

import theano.tensor

# Custom softmax function to support fully convolutional networks
def softmax(x):
    # Subtract the per-pixel max over the class axis (axis=1) for numerical stability
    e_x = theano.tensor.exp(x - x.max(axis=1, keepdims=True))
    return e_x / e_x.sum(axis=1, keepdims=True)


Isn't this the custom softmax you are referring to? 
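
Just to double-check my understanding, here is a quick sanity test that this
normalizes per pixel on a 4D output (the shapes are made up for illustration):

import numpy
import theano
import theano.tensor as T

x = T.tensor4('x')  # (batch, channels, rows, cols)
e_x = T.exp(x - x.max(axis=1, keepdims=True))
probs = e_x / e_x.sum(axis=1, keepdims=True)
f = theano.function([x], probs)

out = f(numpy.random.randn(1, 2, 4, 4).astype(theano.config.floatX))
print(out.sum(axis=1))  # should be ~1.0 at every pixel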

Or are you referring to the loss function? I have the following:

loss = lasagne.objectives.categorical_crossentropy(prediction, target)

I could possibly flatten the inputs like this:

loss = lasagne.objectives.categorical_crossentropy(prediction.flatten(), target.flatten())


But I don't think that is going to change anything.
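
If plain flattening is wrong, I suppose the reshaping would have to look more
like this instead, treating every pixel as one sample (assuming prediction has
shape (batch, 2, rows, cols) after the softmax and target is a
(batch, rows, cols) tensor of integer labels; the names are placeholders):

# Sketch only: collapse the spatial dimensions so each pixel is one sample.
pred_flat = prediction.dimshuffle(0, 2, 3, 1).reshape((-1, 2))  # (batch*rows*cols, 2)
target_flat = target.flatten()                                  # (batch*rows*cols,)
loss = lasagne.objectives.categorical_crossentropy(pred_flat, target_flat).mean()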


By the way, you are right: the data is from the ISBI dental challenge on 
detection of caries in bitewing radiography: 
http://www-o.ntust.edu.tw/~cweiwang/ISBI2015/challenge2/index.html

Thanks for the help.

-Alex


On Saturday, July 30, 2016 at 4:46:26 PM UTC-7, Robb Brown wrote:
>
> You could add a two-element (convolutional) softmax layer on the output to 
> get a pair of class probabilities. Taking the argmax of that would give you 
> the predicted class. I just use cross entropy on the two-class U-Nets and it 
> seems to work fine. I think Lasagne has some other options built in if 
> you're using that. 
>
> To do a convolutional softmax layer you'll need to write a custom softmax 
> function that can handle the two spatial dimensions. It's a one-liner, 
> though, easy to write from the equation given in the Theano documentation. 
>
> I assume the challenge is to identify the teeth?
>
>
