> I am trying to solve a network problem where I have sources and sinks
> only and the decision variables determine how the flow from sources
> is partitioned over the sinks. There are about 700K sources and 1K
> sinks.  There are about 55 million non-zero variables.  There are
> capacity constraints on the sources and requirements on the sinks
> resulting in about 700K constraints.
> 
> I created a MathProg file describing the problem. I was not able to
> process the file using 8GB of RAM. GLPSOL v4.44 died while generating
> caps because it ran out of memory.

Your model is too large to be processed by the MathProg translator
(it is simply not designed for models of this size). Besides, if you
used a 32-bit GLPK build, at most 2GB of RAM could actually be used.

> 
> I will need to reformulate the problem at hand, but the reformulation
> will still have a problem size with over a million variables.
> 
> Nevertheless, I have the following questions:
> 
> a) Is there some formula to predict the amount of memory needed to solve
> a problem for GLPSOL?

Please see:
http://lists.gnu.org/archive/html/help-glpk/2008-07/msg00044.html
for some calculations about the memory requirements.

> 
> b) I plan to use the C API functions to solve a problem rather than
> using a MathProg file.  Do the memory requirements change depending on
> the way a problem is uploaded to GLPSOL?   

Yes, using the GLPK API will significantly reduce the memory
requirements. However, if your model is a pure min-cost flow problem,
you might use the GLPK network routines and solve it with the
out-of-kilter algorithm, which is about ten times more efficient than
the simplex method. See the GLPK documentation for details.


_______________________________________________
Help-glpk mailing list
[email protected]
https://lists.gnu.org/mailman/listinfo/help-glpk