We would like to run the DICE-2016R2 model in GAMS, for which the code is publicly available here:
http://www.econ.yale.edu/~nordhaus/home ... 1916ap.gms
Then, to assess the parameters in the model, we want to do a sensitivity analysis. One thing we want to look at, for example, is:
- varying the parameter "prstp" over a range of values from 0.001 to 0.015: what values does that produce for the main variables of interest, CPRICE(t) and scc(t)?
Can someone explain how to do this?
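One common pattern is a scenario set plus a loop: re-solve the model once per prstp value and store the results in scenario-indexed parameters, then unload those to GDX. A minimal sketch follows; the names co2, utility, rr(t), tstep and the post-solve scc(t) computation are taken from my reading of the published DICE-2016 code and should be checked against your copy, and any other parameter that depends on prstp must also be recomputed inside the loop.

```gams
* Sensitivity loop over prstp -- sketch only; verify all model-specific
* names (co2, utility, rr, tstep, scc) against your copy of the DICE code.
Set s 'prstp scenarios' / s1*s15 /;
Parameter prstpval(s) 'prstp value per scenario';
prstpval(s) = 0.001 + 0.001*(ord(s) - 1);

Parameters
   cprice_s(t,s) 'carbon price by scenario'
   scc_s(t,s)    'social cost of carbon by scenario';

loop(s,
   prstp = prstpval(s);
*  recompute everything that depends on prstp, e.g. the discount factor
   rr(t) = 1/((1 + prstp)**(tstep*(t.val - 1)));
   solve co2 maximizing utility using nlp;
*  in the DICE code scc(t) is derived from marginals after the solve,
*  so that assignment must also sit inside the loop, before these lines
   cprice_s(t,s) = CPRICE.l(t);
   scc_s(t,s)    = scc(t);
);

execute_unload 'sensitivity.gdx', cprice_s, scc_s;
*  optional: dump the two parameters to an Excel workbook via gdxxrw
execute 'gdxxrw sensitivity.gdx par=cprice_s rng=CPRICE!A1 par=scc_s rng=SCC!A1';
```

Each parameter then comes out as a table with periods t in the rows and scenarios s1..s15 in the columns, which Excel can display directly.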
For simplicity and convenience, one might instead refer to the mean-variance model from the GAMS model library (attached as a file) as an example, rather than the DICE model. In the last part of that model's code there are several loops. Instead of using those loops, we would like to create a GDX file so that we get a nice-looking table in Excel as output for our sensitivity analysis.
Basically we want to create a table with, in the first column, the analysed investment(s) 'i', below that its variance, its return, and finally its lambda, and we want to display those values for various portfolios (portfolios 1 to 10), which should hence be displayed in the rows of the table. Preferably in Excel.
I thought this could be done with declarations and assignments along the following lines, but I cannot get it to work. Can someone explain how to do this?

SummaryReport(*,*)
SummaryReport(i,p)
SummaryReport('variance',p)
SummaryReport('return',p)
SummaryReport('lambda',p)