I've been experimenting with L1 fits of a line to data with random errors in both X and Y, in order to collect performance profiling data. The model is cf12a.mod, with the data generated by an awk script.
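For reference, here is a minimal MathProg sketch of this kind of L1 line fit. It is an illustrative reconstruction under my own naming, not the actual cf12a.mod, which I haven't included here:

# Hypothetical sketch of an L1 line fit posed as an LP (not cf12a.mod):
# minimize the sum of absolute residuals |y[i] - (a*x[i] + b)| by
# bounding each absolute value with a nonnegative variable e[i].

param n, integer, > 0;          # number of data points
set I := 1..n;
param x{I};                     # abscissae
param y{I};                     # ordinates

var a;                          # slope (free)
var b;                          # intercept (free)
var e{I}, >= 0;                 # e[i] >= |residual of point i|

s.t. res_pos{i in I}: e[i] >=  y[i] - (a*x[i] + b);
s.t. res_neg{i in I}: e[i] >= -(y[i] - (a*x[i] + b));

minimize total_abs_dev: sum{i in I} e[i];

solve;
printf "slope a = %g, intercept b = %g\n", a, b;
end;

With the data supplied through -d, a model of this shape has two inequality rows per point, so at roughly 714 points the LP has on the order of 1400 rows plus the residual variables.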
At slightly over 700 points, I find that the interior-point method sometimes fails to converge. The number of failures in a run of 10 instances with 714 points varies from run to run. This is the case whether the data points themselves vary from run to run or the points are the same but are supplied in a different order. The command is:

glpsol --interior -m tst.mod -d tst.dat

Can anyone point me to an explanation of why this happens? I'm guessing it is a result of floating-point arithmetic, but I would like to understand it better.

Thanks,
Reg
