On Mon, Oct 18, 2010 at 7:05 PM, MM <[email protected]> wrote:
> Hi,
> I have a deterministic function (computer code) of a couple of hundred
> variables (the maximum size of the problem under normal conditions).
>  Each evaluation takes about 50ms. It is not very expensive.
> Of the 200 variables, 40 are real numbers between 0 and 1 (actually 39 and
> the 40th is just 1 - sum of 39 others), another 20 also reals between 0 and
> 0.5..
> The rest are strictly positive integers between 1 and 500 (for the largest
> upper bound)
> I currently have a brute force solution which enumerates all the
> space based on user input. I estimate a reasonable run though to take 1

500^140 * 50ms is 10^369 years. So I think you're omitting something
about your problem space, or you've calculated something wrong.
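For reference, that estimate can be checked in log space (the 50 ms per evaluation figure comes from your message; everything else is plain arithmetic):

```python
import math

# Brute force: 140 integer variables, 500 values each, 50 ms per evaluation.
log10_evals = 140 * math.log10(500)                     # ~377.9
log10_seconds = log10_evals + math.log10(0.05)          # ~376.6
log10_years = log10_seconds - math.log10(365.25 * 24 * 3600)
print(round(log10_years))                               # → 369
```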

If your combinatorial integer part is large (like 500^140) and
non-smooth, then your problem is not globally optimizable.

I recently worked on a problem with a 46d mostly-smooth real part and a
46d integer part (about 552 bits of integer space in total). Even
though the objective function was very fast, this problem was not
directly solvable.

The problem was, however, decomposable into a 46d real problem
containing 23 2d integer sub-problems (with about 2^24 states for each
sub-problem).  I just made my objective function solve the 23 integer
sub-problems exhaustively at each optimization step and used NLopt to
search the 46d space. It worked very nicely.

Perhaps your problem is amenable to this kind of subdivision.
Otherwise you can probably only hope to search a regular grid of the
possible integer space and then do only local optimization.  The 3^140
operations needed to search the unit hypercube around even a single
integer grid point are too computationally expensive, even with a very
fast objective function.
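To put a number on that (same log-space arithmetic as before, again assuming 50 ms per evaluation):

```python
import math

# Unit-hypercube neighbours of one grid point in 140 integer dimensions:
# each coordinate takes one of {-1, 0, +1}, so 3^140 points.
log10_points = 140 * math.log10(3)             # ~66.8
log10_years = (log10_points + math.log10(0.05)
               - math.log10(365.25 * 24 * 3600))
print(math.floor(log10_points), round(log10_years))
```

So even one local neighbourhood is around 10^66 evaluations, on the order of 10^58 years at 50 ms each.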

_______________________________________________
NLopt-discuss mailing list
[email protected]
http://ab-initio.mit.edu/cgi-bin/mailman/listinfo/nlopt-discuss
