Outer-approximation (OA)
Author: Yousef Aloufi (CHEME 6800 Fall 2021)
Introduction
Mixed-integer nonlinear programming (MINLP) deals with optimization problems that combine the modeling capabilities of mixed-integer linear programming (MILP) and nonlinear programming (NLP) into a single powerful framework. The integer variables make it possible to incorporate discrete decisions and logical relations into optimization problems, and the combination of linear and nonlinear functions makes it possible to accurately describe a variety of different phenomena. The ability to accurately model real-world problems has made MINLP an active research area, and there are numerous applications in fields such as engineering, computational chemistry, and finance. [1]
MINLP problems are usually among the hardest optimization problems to solve unless a special structure can be exploited. Consider the following general MINLP formulation: [2]
<math display=block>\min~~ Z=c^{T}y+f(x)</math>
<math display=block>s.t.~~~~~~~h(x)=0</math>
<math display=block>~~~~~~~g(x)\leq 0</math>
<math display=block>~~~~~~~Ax=a</math>
<math display=block>~~~~~~~Cx+By\leq d</math>
<math display=block>~~~~~~~x \in X\subseteq \mathbb{R}^{n},~~ y\in\{0,1\}^{m}</math>
Here <math display=inline>f(x)</math> and <math display=inline>g(x)</math> are nonlinear functions of the continuous variables <math display=inline>x</math>, <math display=inline>h(x)=0</math> are the nonlinear equations, and the binary variables <math display=inline>y</math> enter linearly through the cost term <math display=inline>c^{T}y</math> and the constraints <math display=inline>Cx+By\leq d</math>.
Theory
The Outer-Approximation (OA) algorithm was first proposed by Duran and Grossmann in 1986 to solve MINLP problems. The basic idea of the OA method is to solve an alternating sequence of nonlinear programming (NLP) subproblems and relaxed mixed-integer linear programming (MILP) master problems. [4] In a minimization problem, for example, each NLP subproblem arises by fixing the binary variables and optimizing over the continuous variables, and every feasible subproblem yields an upper bound on the original MINLP. The MILP master problem provides a global linear approximation to the MINLP in which the objective function is underestimated and the nonlinear feasible region is overestimated. In addition, the linear approximations of the nonlinear equations are relaxed as inequalities. The master problem accumulates the linear approximations from previous iterations, producing an increasingly tighter approximation of the original MINLP. At each iteration it predicts new values of the binary variables and a lower bound on the objective function. The search terminates when no lower bound can be found below the current best upper bound, at which point the master problem becomes infeasible. [2]
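When <math display=inline>f</math> and <math display=inline>g</math> are convex, the accumulated linearizations never cut off a feasible point, so the sequence of lower bounds from the master problems is non-decreasing while the best upper bound from the subproblems can only improve. In the notation of the Algorithm section below, the bounds bracket the optimum <math display=inline>Z^{*}</math> as sketched here:
<math display=block>Z_{L}^{1}\leq Z_{L}^{2}\leq \cdots \leq Z_{L}^{K}\leq Z^{*}\leq Z_{U}^{K}=\min_{k\leq K} Z(y^{k})</math>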
Algorithm
The specific steps of this algorithm, assuming feasible solutions for the NLP subproblems, are as follows:
Step 1: Select an initial value of the binary variables <math display=inline>y^{1}</math>. Set the iteration counter <math display=inline>K=1</math>. Initialize the lower bound <math display=inline>Z_{L}=-\infty</math> and the upper bound <math display=inline>Z_{U}=+\infty</math>.
Step 2: Solve the NLP subproblem for the fixed value <math display=inline>y^{K}</math> to obtain the solution <math display=inline>x^{K}</math> and the multipliers <math display=inline>\lambda^{K}</math> for the equations <math display=inline>h(x)=0</math>:
<math display=block>Z(y^{K})=\min~~ c^{T}y^{K}+f(x)</math>
<math display=block>s.t.~~~~~~~h(x)=0</math>
<math display=block>~~~~~~~g(x)\leq 0</math>
<math display=block>~~~~~~~Ax=a</math>
<math display=block>~~~~~~~Cx\leq d-By^{K}</math>
<math display=block>~~~~~~~x \in X</math>
Step 3: Update the bounds and prepare the information for the master problem:
- Update the current upper bound: if <math display=inline>Z(y^{K})<Z_{U}</math>, set <math display=inline>Z_{U}=Z(y^{K}),~~y^{*}=y^{K},~~x^{*}=x^{K}</math>.
- Derive the integer cut <math display=inline>IC^{K}</math> that makes the choice of the binaries <math display=inline>y^{K}</math> infeasible in subsequent iterations:
<math display=block>IC^{K}=\big\{ y : \sum_{i\in B^{K}} y_{i}-\sum_{i\in N^{K}} y_{i} \leq \mid B^{K}\mid -1 \big\},~~~~ B^{K}=\big\{i\mid y_{i}^{K}=1\big\},~~ N^{K}=\big\{i\mid y_{i}^{K}=0\big\}</math>
- Define the diagonal direction matrix <math display=inline>T^{K}</math> for relaxing the equations into inequalities based on the sign of the multipliers <math display=inline>\lambda^{K}</math>. The diagonal elements are given by <math display=inline>t_{ii}^{K}=\operatorname{sign}(\lambda_{i}^{K})</math>:
<math display=block>t_{ii}^{K}=\begin{cases} -1 & \lambda_{i}^{K}<0 \\ ~~0 & \lambda_{i}^{K}=0 \\ +1 & \lambda_{i}^{K}>0 \end{cases}</math>
These linearizations, integer cuts, and relaxation directions are assembled into the MILP master problem sketched below.
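For reference, the relaxed MILP master problem built from this information has, in the standard outer-approximation/equality-relaxation form of [2], roughly the structure below; this is a sketch in the notation of this section, and the precise statement is given in the references:
<math display=block>Z_{L}^{K}=\min~~ c^{T}y+\eta</math>
<math display=block>s.t.~~~~~~~\eta \geq f(x^{k})+\nabla f(x^{k})^{T}(x-x^{k}),~~~~ k=1,\dots,K</math>
<math display=block>~~~~~~~T^{k}\,\nabla h(x^{k})^{T}(x-x^{k})\leq 0,~~~~ k=1,\dots,K</math>
<math display=block>~~~~~~~g(x^{k})+\nabla g(x^{k})^{T}(x-x^{k})\leq 0,~~~~ k=1,\dots,K</math>
<math display=block>~~~~~~~Ax=a,~~ Cx+By\leq d</math>
<math display=block>~~~~~~~y \in IC^{k},~~~~ k=1,\dots,K</math>
<math display=block>~~~~~~~x \in X,~~ y\in\{0,1\}^{m},~~ \eta \in \mathbb{R}</math>
Its optimal value gives the updated lower bound <math display=inline>Z_{L}</math>, its binary solution <math display=inline>y^{K+1}</math> is passed to the next NLP subproblem, and the algorithm stops when <math display=inline>Z_{L}\geq Z_{U}</math> or when the master problem becomes infeasible.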
Example
Numerical Example
The following is a step-by-step solution of an MINLP optimization problem using the Outer-Approximation method:[5]
<math display=block>\min~~ z = y_{1}+y_{2}+x_{1}^{2}+x_{2}^{2}</math>
<math display=block>s.t.~~~~~~~(x_{1}-2)^{2}-x_{2}\leq 0</math>
<math display=block>~~~~~~~x_{1}-2y_{1}\geq 0</math>
<math display=block>~~~~~~~x_{1}-x_{2}-3(1-y_{1})^{2}\geq 0</math>
<math display=block>~~~~~~~x_{1}+y_{1}-1\geq 0</math>
<math display=block>~~~~~~~x_{2}-y_{2}\geq 0</math>
<math display=block>~~~~~~~x_{1}+x_{2}\geq 3y_{1}</math>
<math display=block>~~~~~~~y_{1}+y_{2}\geq 1</math>
<math display=block>~~~~~~~0\leq x_{1}\leq 4,~~ 0\leq x_{2}\leq 4,~~ y_{1},y_{2}\in\{0,1\}</math>
Step 1a: Start from an initial binary assignment <math display=inline>y^{1}</math> and solve the resulting NLP subproblem, i.e. the problem above with <math display=inline>y</math> fixed at <math display=inline>y^{1}</math>; its solution <math display=inline>x^{1}</math> provides the first upper bound <math display=inline>Z_{U}=Z(y^{1})</math> and the linearization point for the master problem.
Step 1b: Solve the MILP master problem with OA for <math display=inline>K=1</math>, formed by linearizing the objective and the nonlinear constraint <math display=inline>(x_{1}-2)^{2}-x_{2}\leq 0</math> at <math display=inline>x^{1}</math> and adding the remaining linear constraints together with the integer cut that excludes <math display=inline>y^{1}</math>.
MILP Solution: <math display=inline>y^{2}=(1,0)</math>, Lower Bound = 6
Lower Bound < Upper Bound, so the search continues: the integer cut <math display=inline>IC^{1}</math> removes <math display=inline>y^{1}</math> from further iterations and the counter is set to <math display=inline>K=2</math>.
Step 2a: Start from the master-problem solution <math display=inline>y^{2}=(1,0)</math> and solve the NLP below:
<math display=block>\min~~ z = 1+x_{1}^{2}+x_{2}^{2}</math>
<math display=block>s.t.~~~~~~~(x_{1}-2)^{2}-x_{2}\leq 0</math>
<math display=block>~~~~~~~x_{1}-2\geq 0</math>
<math display=block>~~~~~~~x_{1}-x_{2}\geq 0</math>
<math display=block>~~~~~~~x_{2}\geq 0</math>
<math display=block>~~~~~~~x_{1}+x_{2}\geq 3</math>
<math display=block>~~~~~~~0\leq x_{1}\leq 4,~~ 0\leq x_{2}\leq 4</math>
NLP Solution: <math display=inline>x^{2}=(2,1)</math>, <math display=inline>Z(y^{2})=6</math>.
Upper Bound = 6 = Lower Bound, Optimum!
Optimal Solution for the MINLP: <math display=inline>y_{1}=1,~y_{2}=0,~x_{1}=2,~x_{2}=1,~z=6</math>
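As a quick arithmetic check, substituting <math display=inline>(x_{1},x_{2},y_{1},y_{2})=(2,1,1,0)</math> into the constraints of the formulation above confirms feasibility and the objective value of 6:
<math display=block>(2-2)^{2}-1=-1\leq 0,~~~~ 2-2(1)=0\geq 0,~~~~ 2-1-3(1-1)^{2}=1\geq 0</math>
<math display=block>2+1-1=2\geq 0,~~~~ 1-0=1\geq 0,~~~~ 2+1=3\geq 3(1),~~~~ 1+0=1\geq 1</math>
<math display=block>z=1+0+2^{2}+1^{2}=6</math>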
GAMS Model
The above MINLP example can be expressed in the General Algebraic Modeling System (GAMS) as follows:
Variable z;
Positive Variables x1, x2;
Binary Variables y1, y2;

Equations obj, c1, c2, c3, c4, c5, c6, c7;

* objective: binary cost terms plus the nonlinear (quadratic) terms
obj.. z =e= y1 + y2 + sqr(x1) + sqr(x2);
* nonlinear inequality constraint
c1.. sqr(x1 - 2) - x2 =l= 0;
* remaining constraints linking the continuous and binary variables
c2.. x1 - 2*y1 =g= 0;
c3.. x1 - x2 - 3*sqr(1 - y1) =g= 0;
c4.. x1 + y1 - 1 =g= 0;
c5.. x2 - y2 =g= 0;
c6.. x1 + x2 =g= 3*y1;
c7.. y1 + y2 =g= 1;

* variable bounds
x1.lo = 0; x1.up = 4;
x2.lo = 0; x2.up = 4;

model Example /all/;
option minlp = bonmin;
option optcr = 0;
solve Example minimizing z using minlp;
display z.l, x1.l, x2.l, y1.l, y2.l;
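The model above is handed to BONMIN. Since DICOPT is the GAMS solver built on the outer-approximation algorithm described in this article, the same model can also be solved with it, and the Step 2a NLP subproblem can be reproduced by fixing the binaries and solving the relaxed model. The lines below are a minimal sketch, assuming both solvers are available in the GAMS installation:

* solve the full MINLP with DICOPT, the GAMS outer-approximation solver
option minlp = dicopt;
solve Example minimizing z using minlp;

* reproduce the Step 2a NLP subproblem: fix y = (1,0) and solve the relaxed model
y1.fx = 1;
y2.fx = 0;
solve Example minimizing z using rminlp;
display z.l, x1.l, x2.l;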
Conclusion
References
1. Kronqvist, J., Bernal, D., Lundell, A., Westerlund, T. (2018). A center-cut algorithm for quickly obtaining feasible solutions and solving convex MINLP problems. Computers & Chemical Engineering 122:105–113.
2. Biegler, L.T., Grossmann, I.E., Westerberg, A.W. (1997). Systematic Methods of Chemical Process Design. Prentice Hall Press.
3. Grossmann, I.E. (2002). Review of nonlinear mixed-integer and disjunctive programming techniques. Optimization and Engineering 3:227–252.
4. Duran, M.A., Grossmann, I.E. (1986). An outer-approximation algorithm for a class of mixed-integer nonlinear programs. Mathematical Programming 36(3):307–339.
5. You, F. (2021). Lecture on Mixed Integer Non-Linear Programming (MINLP). Archives for CHEME 6800 Computational Optimization (2021FA), Cornell University, Ithaca, NY.