From Cornell University Computational Optimization Open Textbook - Optimization Wiki
Example of linearizing the objective function <math>f(x)</math> and the constraint <math>g(x)</math> at the point <math>x^{*} =[2~~~~1]^{T}</math>:

<math display=block>f\big(x\big) =\big( x_{1} \big)^{2} +\big( x_{2} \big)^{2},~~ \bigtriangledown f\big(x\big)=[2x_{1}~~~~2x_{2}]^{T} ~~for~~x^{*} =[2~~~~1]^{T} </math>

<math display=block>f\big(x^{*}\big)+ \bigtriangledown f\big(x^{*}\big)^{T}\big(x-x^{*}\big)=5+[4~~~~2] \begin{bmatrix}x_{1}-2 \\x_{2}-1 \end{bmatrix}=5+4\big(x_{1}-2\big)+2\big(x_{2}-1\big)</math>

<math display=block>g\big(x\big)=\big(x_{1}-2\big)^{2}-x_{2},~~ \bigtriangledown g\big(x\big)=[2x_{1}-4~~~~-1]^{T}~~for~~x^{*} =[2~~~~1]^{T} </math>
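The corresponding first-order linearization of the constraint at the same point follows directly from the values above, since <math>g(x^{*})=-1</math> and <math>\bigtriangledown g(x^{*})=[0~~~~-1]^{T}</math>:

<math display=block>g\big(x^{*}\big)+ \bigtriangledown g\big(x^{*}\big)^{T}\big(x-x^{*}\big)=-1+[0~~~~-1] \begin{bmatrix}x_{1}-2 \\x_{2}-1 \end{bmatrix}=-1-\big(x_{2}-1\big)=-x_{2}</math>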
Author: Yousef Aloufi (CHEME 6800 Fall 2021)
Introduction
Theory
Example
Minimize
Subject to
Solution
Step 1a: Start from
and solve the NLP below:
Minimize
Subject to
Solution: , Upper Bound = 7
Step 1b: Solve the MILP master problem with OA for :
Minimize
Subject to
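The iteration above alternates between an NLP subproblem, which fixes the binary variables and yields an upper bound, and an MILP master problem built from the linearizations (outer approximations) of the nonlinear functions, which yields a lower bound; the algorithm terminates when the two bounds meet. The sketch below illustrates that loop in Python on a small hypothetical convex MINLP, not the example from this page; the toy model, the helper functions solve_nlp_subproblem and solve_milp_master, and the use of scipy.optimize.milp are assumptions made purely for illustration.

<pre>
# A minimal, self-contained sketch of the OA loop on a *hypothetical* toy MINLP
# (not the example worked on this page):
#
#   min  f(x, y) = x^2 + y
#   s.t. g(x, y) = 1.5 - y - x <= 0,   0 <= x <= 4,   y in {0, 1}
#
# The NLP subproblem (y fixed) is solved in closed form; the MILP master is
# solved with scipy.optimize.milp.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

def solve_nlp_subproblem(y):
    """For fixed y, minimize x^2 + y over x in [0, 4] with x >= 1.5 - y.
    Since x^2 is increasing on [0, 4], the minimizer is the smallest feasible x."""
    x = min(max(1.5 - y, 0.0), 4.0)
    return x, x**2 + y

def solve_milp_master(cuts):
    """Minimize eta subject to the OA cuts and the (already linear) constraint g."""
    c = np.array([0.0, 0.0, 1.0])                 # variables: [x, y, eta]
    rows, ubs = [[-1.0, -1.0, 0.0]], [-1.5]       # g:  -x - y <= -1.5
    for x_bar in cuts:                            # OA cut: 2*x_bar*x + y - eta <= x_bar^2
        rows.append([2.0 * x_bar, 1.0, -1.0])
        ubs.append(x_bar**2)
    con = LinearConstraint(np.array(rows), -np.inf, np.array(ubs))
    res = milp(c=c,
               constraints=con,
               integrality=np.array([0, 1, 0]),   # only y is integer
               bounds=Bounds([0.0, 0.0, -1e6], [4.0, 1.0, 1e6]))
    x, y, eta = res.x
    return x, round(y), eta                       # eta is the master lower bound

# Main OA loop: NLP subproblem (upper bound) alternates with MILP master (lower bound).
y, cuts, upper, lower, tol = 0, [], np.inf, -np.inf, 1e-6
for k in range(1, 20):
    x_bar, obj = solve_nlp_subproblem(y)          # step (a): NLP with y fixed
    upper = min(upper, obj)
    cuts.append(x_bar)                            # linearize f at the NLP solution
    x, y, lower = solve_milp_master(cuts)         # step (b): MILP master with OA cuts
    print(f"iteration {k}: upper bound = {upper:.4f}, lower bound = {lower:.4f}")
    if upper - lower <= tol:                      # bounds have met -> optimal
        break
</pre>

On this toy problem the loop converges in two iterations, mirroring the bound-tightening behavior described in the steps above.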
Conclusion
References