The issue is that you are using a derivative-based optimizer on a problem for which such optimizers are well known to perform poorly. You should consider a global optimizer instead. For example, "rgenoud" combines a genetic search algorithm with a BFGS optimizer, and it handles your problem well; both objectives are shown below, first with optim and then with genoud.
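As a complementary sanity check, independent of rgenoud, you can also restart optim itself from many random starting points and keep the best value; if the restarts disagree with a single run, the single run is suspect. A minimal sketch (the set.seed call and the 20 runif starts are my own illustration, not part of any package):

```r
# Multistart sanity check: rerun optim from random starting points
# inside the box [0,1] x [0,1] and keep the best objective value found.
myfunc <- function(x) abs(x[1] - x[2])

set.seed(1)  # reproducible random starts (illustrative choice)
starts <- replicate(20, runif(2), simplify = FALSE)

fits <- lapply(starts, function(s)
  optim(s, myfunc, lower = c(0, 0), upper = c(1, 1),
        method = "L-BFGS-B", control = list(fnscale = -1)))

# Pick the run with the largest objective value (fnscale = -1 maximizes).
vals <- vapply(fits, function(f) f$value, numeric(1))
best <- fits[[which.max(vals)]]
best$par    # near a corner such as (1,0) or (0,1) if any restart escaped the ridge
best$value
```

Starting points with x1 = x2 sit exactly on the non-differentiable ridge, but random starts almost never do, so at least one restart should reach the boundary maximum.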
library(rgenoud)

myfunc <- function(x) {
  x1 <- x[1]
  x2 <- x[2]
  abs(x1 - x2)
}

optim(c(0.5, 0.5), myfunc, lower = c(0, 0), upper = c(1, 1),
      method = "L-BFGS-B", control = list(fnscale = -1))

genoud(myfunc, nvars = 2, Domains = rbind(c(0, 1), c(0, 1)),
       max = TRUE, boundary.enforcement = 2)

myfunc <- function(x) {
  x1 <- x[1]
  x2 <- x[2]
  (x1 - x2)^2
}

optim(c(0.2, 0.2), myfunc, lower = c(0, 0), upper = c(1, 1),
      method = "L-BFGS-B", control = list(fnscale = -1))

genoud(myfunc, nvars = 2, Domains = rbind(c(0, 1), c(0, 1)),
       max = TRUE, boundary.enforcement = 2)

Cheers,
Jas.

=======================================
Jasjeet S. Sekhon
Associate Professor
Travers Department of Political Science
Survey Research Center
UC Berkeley
http://sekhon.berkeley.edu/
V: 510-642-9974
F: 617-507-5524
=======================================

Paul Smith writes:
> It seems that there is here a problem of reliability, as one never
> knows whether the solution provided by R is correct or not. In the
> case that I reported, it is fairly simple to see that the solution
> provided by R (without any warning!) is incorrect, but, in general,
> that is not so simple and one may take a wrong solution as a correct
> one.
>
> Paul
>
>
> On 5/8/07, Ravi Varadhan <[EMAIL PROTECTED]> wrote:
> > Your function, (x1-x2)^2, has zero gradient at all the starting values
> > such that x1 = x2, which means that the gradient-based search methods
> > will terminate there because they have found a critical point, i.e. a
> > point at which the gradient is zero (which can be a maximum, a minimum,
> > or a saddle point).
> >
> > However, I do not know why optim converges to the boundary maximum when
> > the analytic gradient is supplied (as shown by Sundar).
> >
> > Ravi.
> >
> > -----------------------------------------------------------------------
> >
> > Ravi Varadhan, Ph.D.
> >
> > Assistant Professor, The Center on Aging and Health
> > Division of Geriatric Medicine and Gerontology
> > Johns Hopkins University
> > Ph: (410) 502-2619
> > Fax: (410) 614-9625
> > Email: [EMAIL PROTECTED]
> > Webpage: http://www.jhsph.edu/agingandhealth/People/Faculty/Varadhan.html
> >
> > -----------------------------------------------------------------------
> >
> > -----Original Message-----
> > From: [EMAIL PROTECTED]
> > [mailto:[EMAIL PROTECTED] On Behalf Of Paul Smith
> > Sent: Monday, May 07, 2007 6:26 PM
> > To: R-help
> > Subject: Re: [R] Bad optimization solution
> >
> > On 5/7/07, Paul Smith <[EMAIL PROTECTED]> wrote:
> > > > I think the problem is the starting point. I do not remember the
> > > > details of the BFGS method, but I am almost sure the (.5, .5)
> > > > starting point is suspect, since the abs function is not
> > > > differentiable at 0. If you perturb the starting point even
> > > > slightly you will have no problem.
> > > >
> > > > From: "Paul Smith" <[EMAIL PROTECTED]>
> > > > Sent by: [EMAIL PROTECTED]
> > > > To: r-help@stat.math.ethz.ch
> > > > Subject: [R] Bad optimization solution
> > > > 05/07/2007 04:30 PM
> > > >
> > > > Dear All
> > > >
> > > > I am trying to perform the optimization problem below, but I am
> > > > getting (0.5,0.5) as the optimal solution, which is wrong; the
> > > > correct solution should be (1,0) or (0,1).
> > > >
> > > > Am I doing something wrong? I am using R 2.5.0 on Fedora Core 6
> > > > (Linux).
> > > >
> > > > Thanks in advance,
> > > >
> > > > Paul
> > > >
> > > > ------------------------------------------------------
> > > > myfunc <- function(x) {
> > > >   x1 <- x[1]
> > > >   x2 <- x[2]
> > > >   abs(x1-x2)
> > > > }
> > > >
> > > optim(c(0.5,0.5),myfunc,lower=c(0,0),upper=c(1,1),method="L-BFGS-B",
> > >       control=list(fnscale=-1))
> > >
> > > Yes, with (0.2,0.9), a correct solution comes out. However, how can
> > > one be sure in general that the solution obtained by optim is correct?
> > > ?optim says:
> > >
> > >     Method '"L-BFGS-B"' is that of Byrd _et al._ (1995), which allows
> > >     _box constraints_, that is, each variable can be given a lower
> > >     and/or upper bound. The initial value must satisfy the
> > >     constraints. This uses a limited-memory modification of the BFGS
> > >     quasi-Newton method. If non-trivial bounds are supplied, this
> > >     method will be selected, with a warning.
> > >
> > > which only demands that "the initial value must satisfy the constraints".
> >
> > Furthermore, (x1-x2)^2 is everywhere differentiable, and nevertheless the
> > reported problem occurs with
> >
> > myfunc <- function(x) {
> >   x1 <- x[1]
> >   x2 <- x[2]
> >   (x1-x2)^2
> > }
> >
> > optim(c(0.2,0.2),myfunc,lower=c(0,0),upper=c(1,1),method="L-BFGS-B",
> >       control=list(fnscale=-1))
> >
> > Paul
> >
> > ______________________________________________
> > R-help@stat.math.ethz.ch mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-help
> > PLEASE do read the posting guide
> > http://www.R-project.org/posting-guide.html
> > and provide commented, minimal, self-contained, reproducible code.