I solve timeseries NLPs with about 6 million variables in GAMS using the IPOPTH solver. The same model base is used for various objective analyses. The model is discretized at a 1-hour interval over a 24-hour horizon. There are some optional temporal constraints that link subsequent timepoints. However, in some analyses the temporal constraints are not needed, which leaves independent per-timepoint models that are parallelizable. See a crude example below:
Code:
obj..       z_obj =e= sum(t, x(t));
circle(t).. sqr(x(t)) + sqr(y(t)) =l= sqr(a(t));
x.lo(t) = x_min(t);  x.up(t) = x_max(t);
y.lo(t) = y_min(t);  y.up(t) = y_max(t);
where a(t), x_min(t), x_max(t), y_min(t) and y_max(t) are timeseries input parameters. I want to solve such a model in the GUSS Grid environment by defining the timepoints as parallel scenarios. The gussgrid.gms example (https://www.gams.com/latest/gamslib_ml/ ... sgrid.html
) serves as my reference for setting up the solve configuration. I realize that I'm not able to parallelize my model as it stands, because the scenario set, in this case the timepoint set t, also indexes the equations, variables and parameters that form the model. Is there a way to handle this kind of modelling situation and take advantage of the GUSS Grid environment without significantly changing the model structure?
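For context, my understanding is that GUSS expects a scalar core model whose data is swapped per scenario through the dict set. Below is my own sketch of how the crude example might be recast that way; the equation names (obj, circle), the scalar aa and the result parameter xres are just illustrative names I made up, and this drops the t index from the core model, which is exactly the kind of restructuring I was hoping to avoid:

Code:
Set t 'timepoints, used here as scenarios' /t1*t24/;
Parameter a(t), x_min(t), x_max(t), y_min(t), y_max(t);
Parameter xres(t) 'collected per-scenario x levels';

Scalar aa 'per-scenario value of a(t)';
Variables z, x, y;
Equations obj, circle;
obj..    z =e= x;
circle.. sqr(x) + sqr(y) =l= sqr(aa);
Model m /all/;

* GUSS dictionary: t drives the scenarios; a(t) updates aa,
* the bound parameters update the variable bounds, and the
* optimal x of each scenario is collected in xres(t)
Set dict / t    .scenario.''
           a    .param   .aa
           x_min.lower   .x
           x_max.upper   .x
           y_min.lower   .y
           y_max.upper   .y
           xres .level   .x  /;

solve m use nlp min z scenario dict;

Since the scenarios are independent, sum(t, xres(t)) would reproduce the original objective sum(t, x(t)) after the GUSS solve.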