Re: [R] any more direct-search optimization method in R

2006-03-01 Thread Jasjeet Singh Sekhon

 Given that information, I think a genetic algorithm
 should probably do well with your problem. 

You may want to try the rgenoud package (R-GENetic Optimization Using
Derivatives) which is on CRAN.  For more information see:

http://sekhon.berkeley.edu/rgenoud/

It works well for these kinds of problems, and the package is
relatively flexible. For time-consuming problems it can be run in
parallel if you have multiple cores, CPUs, or machines.
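As a minimal, hedged sketch (the objective here is a toy stand-in for
the real function, and rgenoud must be installed from CRAN; `Domains`
supplies one row of lower/upper bounds per parameter):

```r
## Toy noisy objective standing in for the poster's function:
## a quadratic with small random jitter.
library(rgenoud)

noisy <- function(x) sum((x - 1)^2) + rnorm(1, sd = 0.01)

set.seed(1)
out <- genoud(noisy, nvars = 2,
              Domains = cbind(rep(0, 2), rep(5, 2)),  # lower, upper bounds
              boundary.enforcement = 2,               # keep candidates inside the box
              print.level = 0)
out$par  # best parameter vector found
```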

Cheers,
JS.

===
Jasjeet S. Sekhon 
  
Associate Professor 
Survey Research Center  
UC Berkeley 

http://sekhon.berkeley.edu/
V: 510-642-9974  F: 617-507-5524
===




Re: [R] any more direct-search optimization method in R

2006-02-28 Thread Ingmar Visser
If you have only boundary constraints on the parameters you can use
method "L-BFGS-B" in optim().
Hth, ingmar
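For example, with a toy smooth objective standing in for the real one
(the key pieces are method = "L-BFGS-B" and the `lower` argument, which
approximates the x_i > 0 box):

```r
## Box-constrained quasi-Newton minimization with optim().
fn <- function(x) sum((x - c(1, 2))^2)

res <- optim(par = c(0.5, 0.5), fn = fn,
             method = "L-BFGS-B",
             lower = rep(1e-8, 2))  # approximate the open bound x_i > 0
res$par          # converges to c(1, 2)
res$convergence  # 0 means successful convergence
```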


 From: Weijie Cai [EMAIL PROTECTED]
 Date: Tue, 28 Feb 2006 11:48:32 -0500
 To: r-help@stat.math.ethz.ch
 Subject: [R] any more direct-search optimization method in R
 
 Hello list,
 
 I am dealing with a noisy function (gradient, hessian not available) with
 simple boundary constraints (x_i > 0). I've tried constrOptim() using
 Nelder-Mead to minimize it, but it is way too slow and the returned
 results are not satisfying. Simulated annealing is so hard to tune, and
 it always crashes R in my case. I wonder if there are any packages or
 functions that can do direct-search optimization?
 
 A rough search of the literature shows that multidirectional search and
 the DIRECT algorithm may help. Is there any other suitable algorithm?
 
 Thanks,
 WC
 
 __
 R-help@stat.math.ethz.ch mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html



Re: [R] any more direct-search optimization method in R

2006-02-28 Thread Ben Bolker
   Noisy functions are really challenging to optimize: (1) there is no
best method (despite all the papers comparing stochastic global
optimizers on various sets of test functions); (2) the fancier methods
are hard to program [and existing implementations tend to have more
restricted licenses]; (3) they tend to be slow (thousands of function
evaluations).  Packages on CRAN that *might* be helpful are genalg and
DEoptim.
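A hedged sketch of DEoptim on a toy objective (differential evolution
needs only bounds, not derivatives, which suits a noisy surface; the
DEoptim package must be installed from CRAN):

```r
library(DEoptim)

## Quadratic with random jitter, mimicking a noisy objective.
noisy <- function(x) sum((x - 1)^2) + rnorm(1, sd = 0.01)

set.seed(1)
out <- DEoptim(noisy, lower = rep(0, 2), upper = rep(5, 2),
               control = DEoptim.control(trace = FALSE, itermax = 200))
out$optim$bestmem  # best parameter vector found
```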
   A typical poor man's approach to boundary constraints is to add a
quadratic penalty to the objective function for parameters outside the
constraints (perhaps without even evaluating the objective function
there -- e.g., substituting its value at the closest boundary point).
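That penalty idea can be sketched as follows (the weight 1e4 is an
arbitrary illustrative choice, and `penalize` is a made-up helper name):

```r
## Wrap an objective so that infeasible points are evaluated at the
## closest feasible point, plus a quadratic penalty on the violation.
penalize <- function(fn, lower) {
  function(x) {
    viol <- pmax(lower - x, 0)             # per-parameter bound violation
    fn(pmax(x, lower)) + 1e4 * sum(viol^2)
  }
}

fn  <- function(x) sum((x - c(1, 2))^2)
res <- optim(c(3, 3), penalize(fn, lower = c(0, 0)))  # plain Nelder-Mead
res$par  # close to c(1, 2)
```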
   With more information (number of parameters, time to compute a
single function evaluation, kind of noise) we might be able to help
more.
  
  Ben Bolker



Re: [R] any more direct-search optimization method in R

2006-02-28 Thread Spencer Graves
WC:

  What do you mean by "noisy" in this context?

  1.  You say "gradient, hessian not available".  Is it continuous,
with perhaps discontinuities in the first derivative?

  2.  Or is it something you can compute only to, say, 5 significant 
digits, and some numerical optimizers get lost trying to estimate 
derivatives from so fine a grid that the gradient and hessian are mostly 
noise?

  3.  Also, why do you think constrOptim is too slow?  Does it call 
your function too many times or does your function take too long to 
compute each time it's called?

  4.  What's not satisfactory about the results of constrOptim?

  5.  Do you know whether it has only one local minimum in the
region, or might it have more?

  6.  Regardless of the answers to the above, have you considered using 
expand.grid to get starting values and narrow the search (with 
possibly system.time or proc.time to find out how much time is required 
for each function evaluation)?  I haven't tried this, but I would think 
it would be possible to fit a spline (either exactly or a smoothing 
spline) to a set of points, then optimize the spline.
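The grid-then-refine idea might look like this on a toy objective
(timing with system.time() omitted for brevity):

```r
## Coarse grid via expand.grid() to choose a starting value,
## then local refinement with optim().
fn <- function(x) (x[1] - 1)^2 + (x[2] - 2)^2

grid  <- expand.grid(x1 = seq(0, 3, by = 0.5),
                     x2 = seq(0, 3, by = 0.5))
vals  <- apply(grid, 1, fn)
start <- as.numeric(grid[which.min(vals), ])  # best grid point

res <- optim(start, fn)  # Nelder-Mead refinement from the grid minimum
res$par  # close to c(1, 2)
```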

  hope this helps.
  spencer graves




Re: [R] any more direct-search optimization method in R

2006-02-28 Thread Ben Bolker

Mathematica may well have good optimization routines; I know MATLAB
does (e.g. the optimization toolbox, 
http://www.mathworks.com/products/optimization/?BB=1 , has a 
general-constraints nonlinear optimizer) -- I also think
more people develop optimization code in MATLAB because of its use in
engineering circles.  (The DIRECT algorithm is implemented in
TOMLAB, a MATLAB add-on package from a third party.)
(S+NuOPT is also available from Insightful,
although it doesn't look like it does global stochastic stuff;
our friends at Otter Research have ADMB, although I don't
know how well it handles noisy objective functions.)

Drawbacks: (1) switching platforms is a pain, (2) most of
these alternatives are
expensive [although if you have the money it may well be
worth it], (3) Mathematica is notoriously bad about documenting
its methods and giving references to the peer-reviewed literature.
If I were regularly doing really difficult optimization
problems I might switch to MATLAB+add-ons.

   I would like to see more optimization choices implemented in
R, but haven't gotten around to doing so myself yet.  For hard
optimization problems, however, it is nearly always true that
you have to know something about your problem and tune methods
accordingly (see Spencer Graves' message in this thread).

   Ben

 Original Message 
Subject: Re: [R] any more direct-search optimization method in R
Date: Tue, 28 Feb 2006 09:55:15 -0800 (PST)
From: Greg Tarpinian [EMAIL PROTECTED]
To: Ben Bolker [EMAIL PROTECTED]

This may not be the most helpful advice, but Mathematica is a wonderful
platform having many more built-in optimization routines than R.  I have
found the simulated annealing facility to be robust and much easier to
use than R's own facility.  This may be a case where switching to another
platform would be the easiest solution.

Kind regards,

 Greg




-- 
620B Bartram Hall                         [EMAIL PROTECTED]
Zoology Department, University of Florida http://www.zoo.ufl.edu/bolker
Box 118525                                (ph)  352-392-5697
Gainesville, FL 32611-8525                (fax) 352-392-3704



Re: [R] any more direct-search optimization method in R

2006-02-28 Thread Weijie Cai
Hi All,
Thanks for all your replies, especially Graves's suggestions. You are
right that I should give more information about my function. My
responses to your questions are:

1./2. The function itself is not continuous/smooth. The evaluation at
each point is a random number with non-constant variance. As it
approaches the global minimum, the variance becomes very small. There is
some kind of structure in the surface plot of my function, but its form
is unfortunately intractable.

3./4. Each evaluation of my function is not slow. The results returned
by constrOptim() are just not very close to the true global minimum (the
error can be as large as 0.2). Of course I can ignore the nonconvergence
message, but the precision is really not satisfying. Nelder-Mead uses up
the default 300 iterations every time it optimizes. I guess the
essential reason is the randomness of the function surface.

5. Yes, I am sure there is a global minimum. I did a lengthy computation
on a rough grid, and the minimum found there is very close to the true
minimum.

6. Do you mean I should start from a minimum found by grid search?
That's what I did. I never tried approximating my function with smooth
functions, though.

WC





Re: [R] any more direct-search optimization method in R

2006-02-28 Thread Patrick Burns
Given that information, I think a genetic algorithm
should probably do well with your problem.  Standard
derivative-based optimizers are going to get frustrated
and give up.  I can believe that Nelder-Mead could
get confused as well, though I'm not sure that it will.

'genopt' from S Poetry does have box constraints on the parameters.
I'm not sure what the other genetic algorithms available in R are like.

Patrick Burns
[EMAIL PROTECTED]
+44 (0)20 8525 0696
http://www.burns-stat.com
(home of S Poetry and A Guide for the Unwilling S User)
