Author: Yousef Aloufi (CHEME 6800 Fall 2021)
Introduction
Theory
Example
Numerical Example
The following is a step-by-step solution for an MINLP optimization problem using the Outer-Approximation method:[1]
Minimize
f(x) = y_1 + y_2 + x_1^2 + x_2^2
Subject to
(x_1 - 2)^2 - x_2 \leq 0
x_1 - 2y_1 \geq 0
x_1 - x_2 - 3(1 - y_1) \geq 0
x_1 + y_1 - 1 \geq 0
x_2 - y_2 \geq 0
x_1 + x_2 \geq 3y_1
y_1 + y_2 \geq 1
0 \leq x_1 \leq 4, \quad 0 \leq x_2 \leq 4
y_1, y_2 \in \{0, 1\}
Solution
Step 1a: Start from y^1 = [1 \;\; 1]^T and solve the NLP below:
Minimize
f(x) = 1 + 1 + x_1^2 + x_2^2
Subject to
(x_1 - 2)^2 - x_2 \leq 0
x_1 - 2 \geq 0
x_1 - x_2 \geq 0
x_1 \geq 0
x_2 - 1 \geq 0
x_1 + x_2 \geq 3
0 \leq x_1 \leq 4, \quad 0 \leq x_2 \leq 4
Solution: x^1 = [2 \;\; 1]^T, Upper Bound = 7
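For readers who want to reproduce Step 1a numerically, the sketch below (not part of the original article; the names and the solver choice are illustrative) solves the same NLP subproblem with the binaries fixed at y = (1, 1) using SciPy's SLSQP method.

# NLP subproblem of Step 1a with y fixed at (1, 1); a minimal SciPy sketch.
import numpy as np
from scipy.optimize import minimize

y1, y2 = 1, 1  # binary variables fixed for this subproblem

def objective(x):
    x1, x2 = x
    return y1 + y2 + x1**2 + x2**2

# SLSQP expects inequality constraints written as g(x) >= 0
constraints = [
    {"type": "ineq", "fun": lambda x: x[1] - (x[0] - 2) ** 2},      # (x1-2)^2 - x2 <= 0
    {"type": "ineq", "fun": lambda x: x[0] - 2 * y1},               # x1 - 2*y1 >= 0
    {"type": "ineq", "fun": lambda x: x[0] - x[1] - 3 * (1 - y1)},  # x1 - x2 - 3(1-y1) >= 0
    {"type": "ineq", "fun": lambda x: x[0] + y1 - 1},               # x1 + y1 - 1 >= 0
    {"type": "ineq", "fun": lambda x: x[1] - y2},                   # x2 - y2 >= 0
    {"type": "ineq", "fun": lambda x: x[0] + x[1] - 3 * y1},        # x1 + x2 >= 3*y1
]

res = minimize(objective, x0=np.array([1.0, 1.0]), method="SLSQP",
               bounds=[(0, 4), (0, 4)], constraints=constraints)
print(res.x, res.fun)  # expected: x close to (2, 1) with objective 7, the upper bound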
Step 1b: Solve the MILP master problem with OA for x^1 = [2 \;\; 1]^T:
f(x) = x_1^2 + x_2^2, \quad \nabla f(x) = [2x_1 \;\; 2x_2]^T \text{ for } x^* = [2 \;\; 1]^T
f(x^*) + \nabla f(x^*)^T (x - x^*) = 5 + [4 \;\; 2] \begin{bmatrix} x_1 - 2 \\ x_2 - 1 \end{bmatrix} = 5 + 4(x_1 - 2) + 2(x_2 - 1)
g(x) = (x_1 - 2)^2 - x_2, \quad \nabla g(x) = [2x_1 - 4 \;\; -1]^T \text{ for } x^* = [2 \;\; 1]^T
g(x^*) + \nabla g(x^*)^T (x - x^*) = -1 + [0 \;\; -1] \begin{bmatrix} x_1 - 2 \\ x_2 - 1 \end{bmatrix} = -x_2
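The cuts above can be checked symbolically. The short SymPy sketch below is an assumed helper, not part of the article; it differentiates the two nonlinear terms and evaluates their first-order expansions at x* = (2, 1), reproducing 5 + 4(x_1 - 2) + 2(x_2 - 1) and -x_2.

# Symbolic check of the OA linearizations at x* = (2, 1) using SymPy.
import sympy as sp

x1, x2 = sp.symbols("x1 x2")
xstar = {x1: 2, x2: 1}

f = x1**2 + x2**2      # nonlinear part of the objective
g = (x1 - 2)**2 - x2   # nonlinear constraint

def linearize(expr):
    # expr(x*) + grad expr(x*) . (x - x*)
    grads = [sp.diff(expr, v) for v in (x1, x2)]
    return sp.expand(expr.subs(xstar)
                     + sum(gi.subs(xstar) * (v - xstar[v]) for gi, v in zip(grads, (x1, x2))))

print(linearize(f))  # 4*x1 + 2*x2 - 5, i.e. 5 + 4(x1 - 2) + 2(x2 - 1)
print(linearize(g))  # -x2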
Minimize
y_1 + y_2 + \alpha
Subject to
\alpha \geq 5 + 4(x_1 - 2) + 2(x_2 - 1)
-x_2 \leq 0
x_1 - 2y_1 \geq 0
x_1 - x_2 - 3(1 - y_1) \geq 0
x_1 + y_1 - 1 \geq 0
x_2 - y_2 \geq 0
x_1 + x_2 \geq 3y_1
y_1 + y_2 \geq 1
0 \leq x_1 \leq 4, \quad 0 \leq x_2 \leq 4
y_1, y_2 \in \{0, 1\}, \quad \alpha \in \mathbb{R}
MILP Solution: y^2 = [1 \;\; 0]^T, Lower Bound = 6
Lower Bound = 6 < Upper Bound = 7, Integer cut: y_1 + y_2 \leq 1
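As an illustration of Step 1b, the sketch below assembles the same master MILP with SciPy's milp routine (SciPy 1.9 or newer). The variable ordering [x1, x2, y1, y2, alpha] and the helper arrays are assumptions of this sketch, not notation from the article.

# Master MILP of Step 1b over [x1, x2, y1, y2, alpha]; all constraints as A @ v <= b.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

c = np.array([0, 0, 1, 1, 1])  # minimize y1 + y2 + alpha

A = np.array([
    [ 4,  2,  0,  0, -1],   # OA objective cut: 4*x1 + 2*x2 - 5 <= alpha
    [ 0, -1,  0,  0,  0],   # OA constraint cut: -x2 <= 0
    [-1,  0,  2,  0,  0],   # x1 - 2*y1 >= 0
    [-1,  1, -3,  0,  0],   # x1 - x2 - 3*(1 - y1) >= 0
    [-1,  0, -1,  0,  0],   # x1 + y1 - 1 >= 0
    [ 0, -1,  0,  1,  0],   # x2 - y2 >= 0
    [-1, -1,  3,  0,  0],   # x1 + x2 >= 3*y1
    [ 0,  0, -1, -1,  0],   # y1 + y2 >= 1
])
b = np.array([5, 0, 0, -3, -1, 0, 0, -1])

res = milp(c,
           constraints=LinearConstraint(A, -np.inf, b),
           integrality=np.array([0, 0, 1, 1, 0]),  # only y1, y2 are integer
           bounds=Bounds([0, 0, 0, 0, -np.inf], [4, 4, 1, 1, np.inf]))
print(res.x, res.fun)  # expected: y1 = 1, y2 = 0, objective 6, the lower bound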
Step 2a: Start from y^2 = [1 \;\; 0]^T and solve the NLP below:
Minimize
f(x) = 1 + 0 + x_1^2 + x_2^2
Subject to
(x_1 - 2)^2 - x_2 \leq 0
x_1 - 2 \geq 0
x_1 - x_2 \geq 0
x_1 \geq 0
x_2 \geq 0
x_1 + x_2 \geq 3
0 \leq x_1 \leq 4, \quad 0 \leq x_2 \leq 4
Solution: x^2 = [2 \;\; 1]^T, Upper Bound = 6
Upper Bound = 6 = Lower Bound, Optimum!
Optimal Solution for the MINLP: x_1 = 2, x_2 = 1, y_1 = 1, y_2 = 0, with objective value f = 6
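As a final sanity check, the short snippet below (not part of the article) plugs the reported optimum into every constraint of the original MINLP and recomputes the objective.

# Verify the reported optimum x = (2, 1), y = (1, 0) against all constraints.
x1, x2, y1, y2 = 2.0, 1.0, 1, 0

feasible = {
    "(x1-2)^2 - x2 <= 0":       (x1 - 2) ** 2 - x2 <= 0,
    "x1 - 2*y1 >= 0":           x1 - 2 * y1 >= 0,
    "x1 - x2 - 3*(1-y1) >= 0":  x1 - x2 - 3 * (1 - y1) >= 0,
    "x1 + y1 - 1 >= 0":         x1 + y1 - 1 >= 0,
    "x2 - y2 >= 0":             x2 - y2 >= 0,
    "x1 + x2 >= 3*y1":          x1 + x2 >= 3 * y1,
    "y1 + y2 >= 1":             y1 + y2 >= 1,
    "variable bounds":          0 <= x1 <= 4 and 0 <= x2 <= 4,
}
print(all(feasible.values()))       # True
print(y1 + y2 + x1 ** 2 + x2 ** 2)  # 6.0, matching the reported optimum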
GAMS Model
The above MINLP example can be expressed in the General Algebraic Modeling System (GAMS) as follows:
* Decision variables: continuous x1, x2 and binary y1, y2
Variable z;
Positive Variables x1, x2;
Binary Variables y1, y2;

Equations obj, c1, c2, c3, c4, c5, c6, c7;

* Objective: y1 + y2 + x1^2 + x2^2
obj.. z =e= y1 + y2 + sqr(x1) + sqr(x2);
* Nonlinear constraint (x1 - 2)^2 - x2 <= 0
c1.. sqr(x1 - 2) - x2 =l= 0;
* Linear constraints linking x and y; sqr(1 - y1) equals (1 - y1) for binary y1
c2.. x1 - 2*y1 =g= 0;
c3.. x1 - x2 - 3*sqr(1 - y1) =g= 0;
c4.. x1 + y1 - 1 =g= 0;
c5.. x2 - y2 =g= 0;
c6.. x1 + x2 =g= 3*y1;
c7.. y1 + y2 =g= 1;

* Variable bounds
x1.lo = 0; x1.up = 4;
x2.lo = 0; x2.up = 4;

model Example /all/;
option minlp = bonmin;
* Require a zero relative optimality gap
option optcr = 0;
solve Example minimizing z using minlp;
display z.l, x1.l, x2.l, y1.l, y2.l;
Conclusion
References
1. Content of the reference