Outer-approximation (OA): Difference between revisions

From Cornell University Computational Optimization Open Textbook - Optimization Wiki
'''Step 1a:'''  Solve the MILP master problem with OA for <math display=inline> x^{*} =[2,1] </math> : <br>
<math display=block>f\big(x\big) =\big( x_{1} \big)^{2} +\big( x_{2} \big)^{2},\quad \nabla f\big(x\big)=[2x_{1}~~2x_{2}]^{T} \quad \text{for} \quad x^{*} =[2,1]^{T} </math>
<math display=block>f\big(x^{*}\big)+ \nabla f\big(x^{*}\big)^{T}\big(x-x^{*}\big)=5+[4~~2] \begin{bmatrix}x_{1}-2  \\x_{2}-1  \end{bmatrix}=5+4\big(x_{1}-2\big)+2\big(x_{2}-1\big)</math>
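As a quick numerical check of this linearization, the short Python sketch below (the helper names <code>f</code>, <code>grad_f</code>, and <code>oa_cut</code> are illustrative, not part of the original example) evaluates the objective and its gradient at <math display=inline>x^{*}=[2,1]</math> and confirms that the cut <math display=inline>5+4(x_{1}-2)+2(x_{2}-1)</math> underestimates the convex objective at a trial point.
<syntaxhighlight lang="python">
import numpy as np

# Objective of the example: f(x) = x1^2 + x2^2
def f(x):
    return x[0] ** 2 + x[1] ** 2

def grad_f(x):
    return np.array([2.0 * x[0], 2.0 * x[1]])

x_star = np.array([2.0, 1.0])

def oa_cut(x):
    """First-order (outer-approximation) linearization of f at x_star."""
    return f(x_star) + grad_f(x_star) @ (x - x_star)

print(f(x_star), grad_f(x_star))      # 5.0 [4. 2.]
x_trial = np.array([3.0, 2.0])
print(oa_cut(x_trial))                # 5 + 4*(3-2) + 2*(2-1) = 11.0
print(f(x_trial))                     # 13.0 >= 11.0: the cut underestimates f
</syntaxhighlight>
Because the objective is convex, the linearization never overestimates it, which is what makes the cut valid for the MILP master problem.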


==Conclusion==


==References==

Revision as of 07:06, 26 November 2021

Author: Yousef Aloufi (CHEME 6800 Fall 2021)

==Introduction==

==Theory==

==Example==
Minimize
Subject to

===Solution===
'''Step 1a:''' Start from and solve the NLP below: <br>
Minimize
Subject to
Solution: , Upper Bound = 7

'''Step 1a:''' Solve the MILP master problem with OA for <math display=inline> x^{*} =[2,1] </math> : <br>
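The master problem for this step is not reproduced in this revision, so the sketch below is only an illustration of how the single objective cut derived above could be posed as a mixed-integer linear program with <code>scipy.optimize.milp</code>; the variable bounds and integrality restrictions are assumptions made for illustration, and a full OA master would also include linearizations of the nonlinear constraints accumulated from earlier iterations.
<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import Bounds, LinearConstraint, milp

# Decision vector: [x1, x2, eta], where eta is the linear surrogate of f(x).
c = np.array([0.0, 0.0, 1.0])              # master objective: minimize eta

# OA cut from the linearization at x* = [2, 1]:
#   eta >= 5 + 4*(x1 - 2) + 2*(x2 - 1)  <=>  -4*x1 - 2*x2 + eta >= -5
oa_cut = LinearConstraint([[-4.0, -2.0, 1.0]], lb=-5.0, ub=np.inf)

# Assumed (illustrative) bounds and integrality; the example's actual
# constraint set and variable bounds are not shown in this revision.
bounds = Bounds(lb=[0, 0, -np.inf], ub=[4, 4, np.inf])
integrality = np.array([1, 1, 0])          # x1, x2 integer, eta continuous

res = milp(c=c, constraints=[oa_cut], integrality=integrality, bounds=bounds)
print(res.x, res.fun)                      # optimal eta is a lower bound on f
</syntaxhighlight>
The optimal value of <math display=inline>\eta</math> from such a master problem supplies the lower bound that is compared against the upper bound obtained from the NLP subproblem when checking for OA convergence.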

==Conclusion==

==References==