Running MIP Models Larger than 250 GB
Posted: Fri Jan 18, 2019 2:29 pm
Hi Team,
We are trying to solve an MIP model with GAMS/CPLEX. The model returns a solution when we run it with small sets (i.e., fewer variables). In reality, however, those sets can be very large, with up to 3,000-4,000 elements. When we ran the model with the larger sets (the real-case scenario), model generation time increased substantially, which is to be expected.
By improving the code we have been able to reduce the model size somewhat, but it is still considerable. When we run it, model generation starts, memory use grows to about 250 GB, and the run ends with "Error: Out of Memory". We have tried options such as memoryemphasis, workmem, and solvelink = 0.
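Roughly, the options were set along these lines (a minimal sketch; the model name, objective, and option values here are placeholders, not our actual code):

$onecho > cplex.opt
* ask CPLEX to conserve memory and cap working memory for the B&B tree (MB)
memoryemphasis 1
workmem 2048
$offecho

mymodel.optfile = 1;
* solvelink 0: call the solver via script with model data on disk,
* freeing the GAMS process memory during the solve
option solvelink = 0;
solve mymodel using mip minimizing obj;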
My question is: suppose we increase the RAM by a large amount and give the model the processing power it needs (GPUs, TPUs, etc.), essentially running the model on better hardware. Would it then be reasonable to expect a solution, given that the model did return a solution with fewer variables? Or is it the case that increasing the size of the sets (and thus the number of variables) increases not only the computational requirements but also the complexity of the problem, which may make the model practically unsolvable?
In short, will investing in better hardware pay off? Is there any information available on the largest model GAMS/CPLEX has solved to date? (I understand that the complexity of the formulation also matters, not just the model size, but knowing the maximum solved model size would help.)
Thank you.
Regards,
-Utkarsh