Hi Christian,

> Sorry for the delay but I did not really know what to answer ;-)

we also took some time, because we did not know (exactly) HOW to ask :-) 
Therefore we decided to slightly tidy up the current version of our code and 
put it in a Bitbucket repository. You can find it at 
https://bitbucket.org/tunnuz/gecode-lns together with a small adaptation of the 
TSP example for LNS, full of workarounds (including a patch to gecode/search.hh 
that grants friendship of BaseEngine to LNS, as was done for RBS).

Consider that some of the code is taken verbatim from the RBS engine 
(sometimes comments included) and adapted for LNS.

> So, maybe you could summarize:
> - what does the engine do

The engine performs a Large Neighborhood Search, i.e., starting from a 
complete initial solution obtained by a "start_engine" (1), it iteratively 
explores the neighborhood obtained by relaxing some of the variables of the 
current solution, using an engine to find a better solution in which only 
those variables have changed. The strategy for relaxing the variables is 
problem-specific (it is implemented in a required addition to Space), and so 
is the branching strategy for exploring that neighborhood.

The abstract pseudo-code of LNS (2) is the following [it is really 
pseudo-code; all the technicalities for dealing with cloning, pointers, and 
so on are taken care of in the actual code]:

AbstractLNS()
{
  current->postBranchingForInitialSolution();
  current = start_engine->next(); // find initial solution
  best = current;
  intensity = initialIntensity;
  idle_iterations = 0;
  while (!StopCondition) {
    neighbor = current->cloneAndRelax(intensity);
    // constrain the neighbor to have a solution cost according to a specific
    // constrain strategy (e.g., strictly better than current, or based on
    // Simulated Annealing)
    neighbor->constrain(current, specificConstrainStrategy);
    // let the neighborhood_engine start from the relaxed solution
    neighborhood_engine->reset(neighbor);
    nextSolution = neighborhood_engine->next();
    if (nextSolution->improving(best))
    {
      best = nextSolution;
      current = nextSolution;
      intensity = initialIntensity;
      idle_iterations = 0;
    }
    else if (nextSolution->improving(current) ||
             constrainStrategy == SimulatedAnnealing)
    {
      current = nextSolution;
      intensity = initialIntensity;
      idle_iterations = 0;
    }
    else
      idle_iterations++;
    if (idle_iterations > max_iterations_per_intensity) {
      intensity = increaseIntensity(intensity);
      idle_iterations = 0;
    }
  }
  return best;
}
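To make the control flow above concrete, here is a minimal, self-contained 
C++ sketch of the same acceptance/intensity bookkeeping on a toy minimization 
problem (plain C++, no Gecode; the toy "relax" and "search" steps merely 
stand in for the real engines, and all names here are ours):

```cpp
#include <cassert>
#include <cstdlib>
#include <vector>

// Toy stand-in for the LNS loop: minimize the sum of a vector of ints.
// "Relaxing" with a given intensity re-solves that many randomly chosen
// entries; the toy "neighborhood search" just sets them to 0 (their optimum).
struct Solution {
    std::vector<int> x;
    int cost() const { int s = 0; for (int v : x) s += v; return s; }
};

int lns(Solution current, int initialIntensity, int maxIdlePerIntensity,
        int maxIterations) {
    Solution best = current;
    int intensity = initialIntensity;
    int idle = 0;
    for (int it = 0; it < maxIterations; ++it) {
        // cloneAndRelax: copy the current solution and re-solve `intensity`
        // randomly chosen positions (here: zero them).
        Solution neighbor = current;
        for (int k = 0; k < intensity && k < (int)neighbor.x.size(); ++k)
            neighbor.x[std::rand() % neighbor.x.size()] = 0;
        if (neighbor.cost() < best.cost()) {          // improves best
            best = current = neighbor;
            intensity = initialIntensity;
            idle = 0;
        } else if (neighbor.cost() < current.cost()) { // improves current
            current = neighbor;
            intensity = initialIntensity;
            idle = 0;
        } else if (++idle > maxIdlePerIntensity) {
            ++intensity;  // escalate: relax more variables next time
            idle = 0;
        }
    }
    return best.cost();
}
```

The simulated-annealing acceptance branch is omitted here to keep the toy 
short; it would accept some non-improving neighbors as `current` as well.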

> - what does it require from a space (you already mentioned two additional
> member functions)

Actually the number of required member functions has grown, although some of 
them have a tentative default definition. The augmented interface of Space 
for use in LNS is currently defined in the LNSModel class (in 
hybrid_model.hh).
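As an illustration only (the method names below are our guesses, not the 
actual signatures from hybrid_model.hh), the extra hooks an LNS-capable model 
has to provide might look roughly like this:

```cpp
#include <cassert>
#include <vector>

// Illustrative sketch of the augmented interface: the extra pure virtuals an
// LNS-capable model could expose on top of the usual Space interface.
// All names here are hypothetical.
class ToyLNSModel {
public:
    virtual ~ToyLNSModel() {}
    // Branchings are posted on demand, not in the constructor (see below).
    virtual void postBranchingForInitialSolution() = 0;
    virtual void postBranchingForNeighborhoodExploration() = 0;
    // Problem-specific relaxation: free roughly `intensity` variables.
    virtual void relax(unsigned int intensity) = 0;
    // Constrain the cost w.r.t. a reference solution (strictly/loosely/SA).
    virtual void constrain(const ToyLNSModel& reference) = 0;
    virtual int cost() const = 0;
};

// Minimal concrete instance just to exercise the interface.
class CounterModel : public ToyLNSModel {
public:
    int c = 100;
    std::vector<bool> fixed;
    CounterModel() : fixed(10, true) {}
    void postBranchingForInitialSolution() override {}
    void postBranchingForNeighborhoodExploration() override {}
    void relax(unsigned int intensity) override {
        // Mark the first `intensity` variables as free to be re-searched.
        for (unsigned int i = 0; i < intensity && i < fixed.size(); ++i)
            fixed[i] = false;
    }
    void constrain(const ToyLNSModel& reference) override {
        c = reference.cost() - 1;  // e.g. a strictly-improving strategy
    }
    int cost() const override { return c; }
};
```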

There is another requirement with quite an impact for a space to be used in 
the LNS engine: the posting of branchers MUST be completely separated from 
model posting (i.e., it should not be performed in the Space constructor), 
because LNS requires a fresh root model (free of branchings) for cloning (3): 
in the overall LNS strategy, the branching for searching the initial solution 
is in principle different from the branching for neighborhood exploration. 
Since we do not have access to any method for removing branchers once they 
are posted, we are forced to require that branchings be posted through 
dedicated member functions.
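The pattern this forces on the model can be sketched with toy classes (again, 
no Gecode here, and the names are ours): the constructor posts only the 
constraint model, so every clone of the root is branching-free and the caller 
picks which branching to post afterwards.

```cpp
#include <cassert>
#include <string>

// Toy illustration of deferred brancher posting: the constructor posts only
// the constraint model; branchers are added later via member functions, so a
// clone of the root space starts out branching-free and either branching can
// be chosen per phase. All names are hypothetical.
class ToySpace {
    std::string branching_;  // "" means no brancher posted yet
public:
    ToySpace() {}  // model posting only, no branchers here
    // A fresh, branching-free copy of the root model.
    ToySpace clone() const { return ToySpace(); }
    void postBranchingForInitialSolution() { branching_ = "initial"; }
    void postBranchingForNeighborhoodExploration() {
        branching_ = "neighborhood";
    }
    const std::string& branching() const { return branching_; }
};
```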

> - what other parameters are needed to control the engine

Actually there are a bunch of parameters, which you can find in the 
LNSOptions class, namely: the range of variation of the intensity parameter 
(more or less related to the number of variables to be relaxed), the number 
of non-improving iterations before the intensity is increased, and the 
strategy for constraining the cost value of neighbor solutions (none, 
strictly or loosely improving over current, simulated annealing); for 
simulated annealing there are a few other specific parameters related to the 
temperature, the cooling rate, and the number of neighbors accepted before 
decreasing the temperature.
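Bundled as a struct, that parameter set might look roughly as follows (field 
names are our invention for illustration; the real definitions live in the 
LNSOptions class in the repository):

```cpp
#include <cassert>

// Illustrative parameter bundle; all names are hypothetical.
enum class ConstrainStrategy {
    None, StrictlyImproving, LooselyImproving, SimulatedAnnealing
};

struct ToyLNSOptions {
    // Range of the intensity parameter (roughly: how many variables to relax).
    unsigned int minIntensity = 1;
    unsigned int maxIntensity = 10;
    // Non-improving iterations before the intensity is increased.
    unsigned int maxIdleIterations = 50;
    // How neighbor costs are constrained w.r.t. the current solution.
    ConstrainStrategy constrainStrategy = ConstrainStrategy::StrictlyImproving;
    // Simulated-annealing specifics (only used with that strategy).
    double startTemperature = 100.0;
    double coolingRate = 0.95;
    unsigned int neighborsPerTemperature = 100;
};
```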

We would be happy if you could take a look at the engine and provide us with 
suggestions for integrating it. Consider that we also aim at contributing 
other hybrid metaheuristics that blend with CP by driving the search (at 
least an ACO implementation and possibly some genetic algorithms), so these 
issues are quite general from this point of view.

Thanks, and have a nice weekend (considering the organization duties for the 
CP conference :-)

Luca and Tommaso

----
Notes:
(1) the start engine is actually needed because it will not be possible to 
change the options object (more precisely, the stop object) passed to an 
engine after it has been created. Namely, we need two different stop objects: 
one for the first solution (the "global" meta-engine stop object) and one for 
the neighbors (a time-stop object).
(2) the actual LNS code, though, is implemented differently in the engine 
because we wanted to comply with the "next()" semantics.
(3) we must confess that we are not completely happy with this solution, but 
it was the one with the least impact on the "jungle"
_______________________________________________
Gecode users mailing list
users@gecode.org
https://www.gecode.org/mailman/listinfo/gecode-users
