On Fri, Dec 13, 2019 at 09:12:54AM +0100, Filippo Portera wrote:
> Hello,
> I've posted a question on StackOverflow:
>
> https://stackoverflow.com/questions/59237562/a-generalized-quadratic-loss-for-deep-neural-network-for-multi-class-classificat
>
> Do you think it would be possible to write a loss of this kind within
> MLPack?
> If any development comes of it, I'm interested in co-authoring an
> eventual paper about it.
Hey Filippo,

I think that should be possible. If I understand correctly, you would just
need to implement a new loss layer and its gradient. Take a look at
NegativeLogLikelihood<> and the other loss layers in
src/mlpack/methods/ann/layer/loss_functions/ for examples; I think you can
adapt one of those.

Hope this helps!

Thanks,

Ryan

--
Ryan Curtin       | "Hungry."
[email protected]  |   - Sphinx

_______________________________________________
mlpack mailing list
[email protected]
http://knife.lugatgt.org/cgi-bin/mailman/listinfo/mlpack
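
For concreteness, here is a minimal, self-contained sketch of such a
generalized quadratic loss, written directly against Armadillo (the matrix
library mlpack uses) rather than against mlpack's actual loss-layer
interface. The class name GeneralizedQuadraticLoss, its Forward()/Backward()
methods, and the class-similarity matrix S are illustrative placeholders; to
use it in mlpack you would still need to adapt it to match the loss layers in
the directory Ryan points to.

// Untested sketch, not mlpack's layer API.  For predictions P and one-hot
// targets Y (k x n, one column per sample) and a symmetric k x k
// class-similarity matrix S, the generalized quadratic loss is
//   L = sum_i (y_i - p_i)^T S (y_i - p_i).
#include <armadillo>
#include <iostream>
#include <utility>

class GeneralizedQuadraticLoss
{
 public:
  explicit GeneralizedQuadraticLoss(arma::mat similarity) :
      s(std::move(similarity)) { }

  // Loss summed over all columns (samples) of `prediction` and `target`.
  double Forward(const arma::mat& prediction, const arma::mat& target) const
  {
    const arma::mat e = target - prediction;   // k x n residuals.
    return arma::accu(e % (s * e));            // sum_i e_i^T S e_i.
  }

  // Gradient of the loss with respect to the prediction.
  void Backward(const arma::mat& prediction,
                const arma::mat& target,
                arma::mat& gradient) const
  {
    const arma::mat e = target - prediction;
    gradient = -2.0 * (s * e);                 // d/dP of e^T S e, S symmetric.
  }

 private:
  arma::mat s;  // Class-similarity matrix (assumed symmetric).
};

int main()
{
  // Tiny usage example: 3 classes, 2 samples.
  arma::mat S = arma::eye(3, 3);  // S = I recovers the ordinary squared error.
  arma::mat target = {{1, 0}, {0, 1}, {0, 0}};
  arma::mat prediction = {{0.7, 0.2}, {0.2, 0.6}, {0.1, 0.2}};

  GeneralizedQuadraticLoss loss(S);
  std::cout << "loss = " << loss.Forward(prediction, target) << std::endl;

  arma::mat gradient;
  loss.Backward(prediction, target, gradient);
  gradient.print("gradient:");
  return 0;
}

With S set to the identity this reduces to the usual squared error, which is
an easy sanity check when adapting the sketch to mlpack's loss-layer
interface.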
