Hello,

I'm trying to build a .mod model with data from ten 25,000,000-row .csv files 
(approx. 500 MB each).
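
For reference, here's roughly how I'm reading a CSV into the model — a minimal
sketch using GMPL's table statement, with made-up file, set, and column names:

    set ITEMS;
    param value{ITEMS};

    # hypothetical file and column names
    table data IN "CSV" "rows.csv" : ITEMS <- [item], value ~ value;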

Even building a model with just one of them fails with an insufficient-memory error.

Is there a better way to go about this than loading massive .csv files directly?

Does anyone have any experience with building models from such huge data sets?

cheers,
Andrew
_______________________________________________
Help-glpk mailing list
[email protected]
https://lists.gnu.org/mailman/listinfo/help-glpk
