Hi all gurus,

I have a few general questions about using the neural network function, i.e. the
nnet function. I'm new to this function and still exploring it, so please
kindly bear with me.

Here are my questions.

1. Is there any way I can specify my own objective or loss function for my
neural network? From the arguments that nnet accepts, I see that we have either
squared-error loss or entropy. I'm trying to create an NN that fits ordinal
regression data, so my objective function is the likelihood function, which is
not exactly the entropy function, because the order of the labels matters in my
application. Or am I better off writing my own function?
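For concreteness, here is a sketch of the kind of objective I have in mind: a cumulative-logit (proportional-odds) negative log-likelihood, which respects label order, unlike plain cross-entropy. The names (ordinal_nll, eta, theta) are just placeholders, not anything from nnet:

```r
## A sketch under my assumptions: cumulative-logit (proportional-odds)
## negative log-likelihood. 'eta' is one linear score per observation
## (e.g. a network output), 'theta' is an increasing vector of K-1
## cutpoints, and 'y' holds integer labels 1..K.
ordinal_nll <- function(eta, theta, y) {
  ## P(Y <= k | eta) = plogis(theta_k - eta), an n x (K-1) matrix
  cum <- plogis(outer(-eta, theta, "+"))
  ## category probabilities P(Y = k) by differencing adjacent cumulatives
  p <- cbind(cum, 1) - cbind(0, cum)
  ## pick out p[i, y_i] for each observation and sum the negative logs
  -sum(log(p[cbind(seq_along(y), y)]))
}
```

Minimizing this over the network weights (which determine eta) and the cutpoints theta would be the goal.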

2. If I need to write my own function, I can see that I could use constrOptim to
help me maximize my likelihood. Another question here is: how can I pass an
analytical gradient function for each of my network weights into constrOptim?
Specifically, does my gradient function have to be a flat representation in each
parameter, like the following arbitrary example:

2*w_i1_h1 + 3*w_h1_o1 + 4 (where w_i1_h1 is the weight from input node 1 to
hidden node 1, and similarly for the others)

Or can I use a nested expression like:

sigmoid(node_eval(input_x, j))

where node_eval is a function that takes the input_x feature vector and
produces the output of node j in the next layer?

If expression-based evaluation as in case 2 is possible, it would significantly
simplify the definition of my gradient function's output.
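To make question 2 concrete, here is a toy sketch of the calling pattern I have in mind. The objective f and gradient grad_f are stand-ins, not my real likelihood; the point is the shape of what grad must return:

```r
## Toy sketch: minimize f(w) = sum((w - 1)^2) subject to w >= 0.
## constrOptim expects 'grad' to return one number per element of 'par'
## (a flat numeric vector), but each entry could be computed internally
## via nested helper functions (chain rule through node_eval/sigmoid).
f <- function(w) sum((w - 1)^2)
grad_f <- function(w) 2 * (w - 1)   # flat vector, same length as w
fit <- constrOptim(theta = c(0.5, 0.5),       # start strictly inside w > 0
                   f = f, grad = grad_f,
                   ui = diag(2), ci = c(0, 0)) # constraints: ui %*% w >= ci
fit$par  # close to c(1, 1)
```

Since constrOptim minimizes by default, I assume I would pass the negative log-likelihood and its gradient to maximize the likelihood.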

Any help would be really appreciated on these issues. Thank you.

- adschai

______________________________________________
[email protected] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.