Need explanation on the CNS solver

Solver related questions
Rodrigue
User
Posts: 33
Joined: 5 years ago

Need explanation on the CNS solver

Post by Rodrigue »

Hello,

I have a model that I would like to solve with the CNS solver. I have tried three solvers (CNS, MCP and NLP) on that model (see attachment). In my limited experience with GAMS, when an NLP model solves correctly the process window shows either "** Optimal solution. There are no superbasic variables." or "** Optimal solution. Reduced gradient less than tolerance.". I have always assumed the latter means something went wrong, but I don't know why the status changes from "There are no superbasic variables" to "Reduced gradient less than tolerance." (can someone tell me something about this?).

Concerning the CNS solver, I have been searching online for a way to correct the error "** Error in Square System: Pivot too small.", but I have not found any helpful documentation. Looking at the equation block, one equation, "eq_theta" for AFR, is flagged DEPND; the same flag appears on the variable "E" for AFR. However, when I switch the solver to MCP, everything works correctly. With NLP, I instead get "** Optimal solution. Reduced gradient less than tolerance."

Therefore, I strongly suspect something is wrong with my model, and I would like to know how to continue the diagnosis.
Can someone point me to a relevant reference on CNS that explains, step by step and with an example, how to build a CNS model? I would also like to understand why the NLP status changes from "There are no superbasic variables" to "Reduced gradient less than tolerance", and what the DEPND flag means.
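For context on what a CNS model is: it is simply a square nonlinear system F(x) = 0 with exactly as many equations as free (non-fixed) variables, which the solver attacks with Newton-type iterations on the Jacobian. A minimal sketch of that idea in Python/NumPy (the equations and numbers here are invented for illustration, not taken from the attached model):

```python
import numpy as np

# A CNS-style square system: 2 equations in 2 unknowns, F(v) = 0.
def F(v):
    x, y = v
    return np.array([x + y - 3.0, x * y - 2.0])

def J(v):  # Jacobian; a CNS solver needs this to be nonsingular
    x, y = v
    return np.array([[1.0, 1.0], [y, x]])

# Plain Newton iteration, the core of how a square system is solved.
v = np.array([0.5, 2.5])
for _ in range(50):
    v = v + np.linalg.solve(J(v), -F(v))
    if np.max(np.abs(F(v))) < 1e-10:
        break
print(v)  # approaches the root x = 1, y = 2
```

If the Jacobian loses rank somewhere (for instance when two equations become proportional), `np.linalg.solve` fails the same way a CNS solver reports "Pivot too small".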
MEP4.gms
Rodrigue

Re: Need explanation on the CNS solver

Post by Rodrigue »

I found where the problem lies. As a square system, the model must have the same number of independent equations as endogenous variables.

Looking closer, I observed that equations 1 and 4 are identical, so they are dependent, and I had to remove one of them. After suppressing equation 1, for example, there remain 3 independent equations but 4 endogenous variables. Following the numéraire rule, I fixed the variable "theta", which corrected the problem not only for the CNS solver but also for NLP.
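This diagnosis (two identical equations make the square system singular, and dropping one plus fixing a numéraire restores a nonsingular square system) can be checked with a small rank computation. A NumPy illustration with an invented 4x4 Jacobian, not values from MEP4.gms:

```python
import numpy as np

# Hypothetical Jacobian of a "square" 4-equation / 4-variable system
# in which equation 4 duplicates equation 1 (rows 0 and 3 identical).
J = np.array([
    [1.0, 2.0, 0.0, 1.0],
    [0.0, 1.0, 3.0, 0.0],
    [2.0, 0.0, 1.0, 1.0],
    [1.0, 2.0, 0.0, 1.0],  # duplicate of row 0 -> dependent (DEPND)
])

# Rank 3 instead of 4: no usable pivot exists for the last row,
# which is exactly what "Pivot too small" is complaining about.
print(np.linalg.matrix_rank(J))           # 3

# The fix sketched above: drop the duplicate equation (one row) and
# fix one variable as numeraire (its column drops out of the system).
J_fixed = np.delete(np.delete(J, 3, axis=0), 3, axis=1)
print(np.linalg.matrix_rank(J_fixed))     # 3: square and nonsingular
```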

What I still don't understand is why the MCP solver tolerated this issue.