#11606: simplify constraints in linear programs
----------------------------------+-----------------------------------------
   Reporter:  john_perry          |          Owner:  ncohen    
       Type:  enhancement         |         Status:  needs_info
   Priority:  major               |      Milestone:  sage-4.7.2
  Component:  linear programming  |       Keywords:  sd32      
Work_issues:                      |       Upstream:  N/A       
   Reviewer:                      |         Author:  john_perry
     Merged:                      |   Dependencies:            
----------------------------------+-----------------------------------------

Comment(by john_perry):

 Replying to [comment:5 ncohen]:
 > I was thinking again about what I told you by email: in order to
 simplify constraints more efficiently, one could add the constraints to a
 Set object when adding them to the solvers, and add a new constraint only
 when it is not already included in the set. The search is then much more
 efficient, and handled by Python. If you also want to test for constraints
 which may be formally different (i.e. the original constraint multiplied
 by two), you just have to insert the "normalized" constraint in the set
 instead of the one which is given.


 I have a working implementation along these lines, but there's one
 difficulty. In order to normalize the constraint, the code divides by the
 coefficient of the variable with the smallest index. For example, given
 the constraint `2*x_1-2*x_0<=2`, the code converts this to `x_1-x_0<=1`.

 In order to do this, I want to use the `min()` command to determine the
 smallest key belonging to a variable (the key `-1` holds the constant
 term, so it is excluded), look up its coefficient, and divide:
 {{{
     i = min(k for k in f.keys() if k != -1)
     c = f[i]
     C = [(v, coeff/c) for (v, coeff) in f.iteritems() if v != -1]
 }}}
 The problem with this is that `min` is also the name of a parameter to
 the function, so inside the function body the parameter shadows the
 built-in, and the call above raises a `TypeError`.
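 To illustrate, here is a minimal sketch of the shadowing problem (the
 function name and signature are illustrative, not Sage's actual API):

```python
# Minimal sketch of the shadowing problem: the parameter `min` hides
# the built-in function of the same name inside the function body.
def add_constraint(f, min=None, max=None):
    # Here `min` is the parameter (a number or None), not the built-in,
    # so calling it raises a TypeError.
    return min(f.keys())

try:
    add_constraint({0: 2.0, 1: -2.0}, min=0)
except TypeError as e:
    print("error:", e)
```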

 I can rename the parameters as `min_value` and `max_value`, but this
 would break people's existing code. Alternatively, I can define
 `self._min = min` in `__init__`, and call `self._min()` where I would
 ordinarily call `min()`.
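 A sketch of that second workaround (the class and method names here are
 illustrative, not Sage's actual classes): capture the built-in while it
 is still visible, then call the saved reference inside the method.

```python
# Sketch of the `self._min = min` workaround: save a reference to the
# built-in in __init__, where nothing shadows it, and use that saved
# reference inside methods whose parameters shadow the name.
class LPProblem(object):
    def __init__(self):
        self._min = min  # the built-in is still visible here

    def add_constraint(self, f, min=None, max=None):
        # `min` is now the parameter; use the saved built-in instead.
        i = self._min(k for k in f if k != -1)
        c = f[i]
        return [(v, coeff / c) for (v, coeff) in f.items() if v != -1]

p = LPProblem()
# Normalize 2*x_1 - 2*x_0 by the coefficient of x_0 (the smallest index).
print(p.add_constraint({-1: 2.0, 0: -2.0, 1: 2.0}))
```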

 I am attaching a patch that uses the latter approach, but if you have
 suggestions for a better solution, I'm all ears. It wouldn't surprise me
 if there is one.

 BTW, does it matter if I use `Set` or `set`? You wrote the former, but
 currently I'm using the latter.
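 For what it's worth, with the built-in `set`, the entries must be
 hashable, so I store each normalized constraint as a sorted tuple of
 (variable, coefficient) pairs rather than a list. A quick sketch (the
 helper name is mine, not from the patch):

```python
# Deduplicating normalized constraints with Python's built-in set.
# Lists are unhashable, so each constraint is converted to a sorted
# tuple, which is hashable and order-independent.
def add_if_new(constraint, seen):
    key = tuple(sorted(constraint))  # hashable, canonical ordering
    if key in seen:
        return False  # an equivalent constraint is already present
    seen.add(key)
    return True

seen = set()
print(add_if_new([(0, 1.0), (1, -1.0)], seen))  # True: new constraint
print(add_if_new([(1, -1.0), (0, 1.0)], seen))  # False: duplicate
```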

-- 
Ticket URL: <http://trac.sagemath.org/sage_trac/ticket/11606#comment:9>
Sage <http://www.sagemath.org>
Sage: Creating a Viable Open Source Alternative to Magma, Maple, Mathematica, 
and MATLAB
