As part of my university work, I have to implement a training procedure that, fundamentally, trains an MLP layer by layer based on the correntropy measured between the input and output of a given layer.
I have successfully found some pieces of code related to correntropy in https://github.com/pdoren/DeepEnsemble/blob/master/deepensemble/utils/utils_functions.py#L238-L264 and https://github.com/pdoren/DeepEnsemble/blob/master/deepensemble/utils/cost_functions.py#L210-L237. However, that code can only be used when the two samples have the same size, which is not the case for the input and output of a layer in general. So, my question is: how can I compute the correntropy between the input and output of an MLP layer in Theano?
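For context, this is roughly my understanding of what the linked code computes, written as a minimal Theano sketch: an empirical correntropy using a Gaussian kernel on the element-wise differences of two same-shape tensors. The function names and the bandwidth `sigma` below are my own choices, not taken from that repository.

```python
import theano
import theano.tensor as T

def gaussian_kernel(d, sigma):
    """Gaussian kernel evaluated on the differences d."""
    return T.exp(-T.sqr(d) / (2.0 * sigma ** 2))

def correntropy(x, y, sigma=1.0):
    """Empirical correntropy: mean of the Gaussian kernel applied to
    the element-wise differences. Requires x and y to have the same shape."""
    return T.mean(gaussian_kernel(x - y, sigma))

# Symbolic mini-batches of the same shape.
x = T.matrix('x')
y = T.matrix('y')

# Maximising correntropy is equivalent to minimising its negative.
loss = -correntropy(x, y, sigma=0.5)
f = theano.function([x, y], loss)
```

Since this element-wise form breaks down when the input and output of a layer have different dimensionality, I assume the two tensors would first have to be mapped to a common space somehow (for example via a reconstruction step), but I am not sure what the right approach is in Theano.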
