Hello:
I have been trying to program the following maximization problem and would
definitely welcome some help.

the target function: sum_{i} f(alpha, beta'X_{i}),
                     where alpha and beta are unknown d-dimensional parameters,
                     f is a known function, and the X_{i} are i.i.d. random variables.
I need to maximize the above sum subject to the constraint
                     beta'X_{i} + alpha <= 1,  for i = 1, ..., n.

In one dimension this is fairly straightforward. What should I do with
high-dimensional alpha and beta?  Thanks for your time,
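Here is a minimal sketch of one direction I have looked at, using constrOptim(), which handles exactly this kind of linear inequality constraint (ui %*% theta >= ci). The f below is only a placeholder, the data are simulated, and I am treating alpha as a scalar intercept as the constraint suggests -- substitute your own f and X:

```r
## Sketch: maximize sum_i f(alpha, beta'X_i) s.t. beta'X_i + alpha <= 1.
set.seed(1)
n <- 50; d <- 3
X <- matrix(rnorm(n * d), n, d)               # rows are the i.i.d. X_i

f <- function(alpha, t) -alpha^2 - (t - 0.5)^2  # placeholder for the known f

## Pack theta = c(alpha, beta); constrOptim() minimizes, so negate the sum.
obj <- function(theta) {
  alpha <- theta[1]; beta <- theta[-1]
  -sum(f(alpha, X %*% beta))
}

## Rewrite beta'X_i + alpha <= 1 in constrOptim()'s form ui %*% theta >= ci:
ui <- -cbind(1, X)
ci <- rep(-1, n)

theta0 <- rep(0, d + 1)        # starting value must be strictly feasible: 0 < 1
res <- constrOptim(theta0, obj, grad = NULL, ui = ui, ci = ci)
res$par                        # fitted (alpha, beta)
```

With grad = NULL this falls back to Nelder-Mead inside the barrier iterations; supplying the analytic gradient of the (negated) objective would let it use BFGS instead.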

Shuangge Ma, Ph.D.

______________________________________________
[EMAIL PROTECTED] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
