I made some mistakes in the file table.math.

The correct pseudo-code is:

set I; 
set J; 
param ik; 
param jk; 
param L; 
param U; 
param borne; 

param a{i in I, j in J}; 
var y{i in I, j in J} >=0; 
var x{i in I, j in J} >=0,integer; 

/* model */ 
minimize problem: sum{i in I, j in J} x[i,j]; 

s.t. Fixe: x[ik,jk] = 1; 
s.t. maximize contraint: y[ik,jk] >= a[ik,jk] + borne; 
/* It is a constraint which is itself a linear problem. Is it possible? */ 
        s.t. Ligne{i in I}: sum{j in J} y[i,j] = a[i,0]; 
        s.t. Colonne{j in J}: sum{i in I} y[i,j] = a[0,j]; 
        s.t. Total: sum{i in I,j in J} y[i,j]=a[0,0]; 
        s.t. Borneinf{i in I, j in J}: y[i,j]>=a[i,j]-L*x[i,j]; 
        s.t. Bornesup{i in I, j in J}: y[i,j]<=a[i,j]+U*x[i,j]; 


end; 
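To read off the optimal values of x, one possibility (a minimal sketch, assuming the rest of the model above is accepted by glpsol) would be to add a solve and a display statement just before end;:

solve;       /* solve the problem */
display x;   /* print the value found for each x[i,j] */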

Alexandre DEPIRE




Here is the pseudo-code of my problem. 
My goal is to find the optimal values of x. 

First, the file table.data: 
data; 

set I := 0 1 2 3 4; 
set J := 0 1 2 3; 

param a :   0  1  2  3 := 
        0  71 31 22 18 
        1  25  7 10  8 
        2  26 12  9  5 
        3  13  7  3  3 
        4   7  5  0  2 ; 

param ik :=4; 
param jk :=3; 

param L:=0; 
param U:=71; 
param borne:=2; 
end; 

and then the file table.math: 

set I; 
set J; 
param ik; 
param jk; 
param L; 
param U; 
param borne; 

param a{i in I, j in J}; 
param x{i in I, j in J}; 

var y{i in I, j in J} >=0; 
var x{i in I, j in J} >=0,integer; 

/* model */ 
maximize problem:sum{i in I, j in J} x[i,j]; 
s.t. x[ik,jk]=1; 
        maximize contraint: y[ik,jk] >= a[ik,jk] + borne; 
        /* It is a constraint which is itself a linear problem. Is it possible? */ 
        s.t. Ligne{i in I}: sum{j in J} y[i,j] = a[i,0]; 
        s.t. Colonne{j in J}: sum{i in I} y[i,j] = a[0,j]; 
        s.t. Total: sum{i in I,j in J} y[i,j]=a[0,0]; 
        s.t. Borneinf{i in I, j in J}: y[i,j]>=a[i,j]-L*x[i,j]; 
        s.t. Bornesup{i in I, j in J}: y[i,j]<=a[i,j]+U*x[i,j]; 


end; 


I hope that I have not made any mistakes in the code and that the problem is 
well posed. 
Alexandre DEPIRE




> I would like to know whether the following problem can be solved
> with GLPK or any other free software?

> Let x(i,j) in {0,1}, for i in I and j in J, be unknown variables.
> Let y(i,j) in R, for i in I and j in J, be unknown variables.
> Let a(i,j), l(i,j) and u(i,j) be known data.
> Let K be a known subset of IxJ, K = {(ik,jk)}.

> Problem:
>
>     Min  sum( x(i,j) )
>     subject to
>
>         x(ik,jk) = 1      /* some x(i,j) = 1, the others are to be found */
>
>         max y(ik,jk) >= a(ik,jk) + U*x(ik,jk)
>           subject to
>             linear constraints on y(i,j):
>             a(i,j) - l(i,j)*x(i,j) <= y(i,j) <= a(i,j) + u(i,j)*x(i,j)

You may write your model in GNU MathProg, a modeling language supported
by glpk, and then solve it with glpsol, the glpk stand-alone solver.
For more details please see the glpk documentation included in the
distribution tarball.
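For example (a sketch, assuming the model and data sections above are saved
as table.math and table.data):

    glpsol -m table.math -d table.data -o table.sol

The -o option writes the solution, including the values found for x, to
table.sol.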

_______________________________________________
Help-glpk mailing list
[email protected]
http://lists.gnu.org/mailman/listinfo/help-glpk
