Branch and cut. Cornell University Computational Optimization Open Textbook - Optimization Wiki. Revision of 2020-12-14 by JonBoisvert. https://optimization.cbe.cornell.edu/index.php?title=Branch_and_cut
<hr />
<div>Author: Lindsay Siegmundt, Peter Haddad, Chris Babbington, Jon Boisvert, Haris Shaikh (SysEn 6800 Fall 2020)<br />
<br />
Steward: Wei-Han Chen, Fengqi You<br />
<br />
== Introduction ==<br />
The Branch and Cut methodology was developed in the 1990s as a way to solve Mixed-Integer Linear Programs (Karamanov, Miroslav). It combines two well-known optimization methodologies: Branch and Bound and Cutting Planes. Together these tools allow Branch and Cut to find an optimal solution by relaxing the problem to produce a bound on the objective. Relaxing the problem (dropping the integrality requirements) simplifies the complex problem so that it can be solved more easily. The resulting bound, an upper bound for a maximization problem and a lower bound for a minimization problem, limits the objective values that any feasible integer solution can attain; the optimal solution is found when an integer-feasible objective value meets this bound (Luedtke, Jim). The methodology is notable because it combines two common tools so that each contributes to finding the optimal solution; moving forward, the critical components of different methodologies could be combined in a similar way to reach optimality in a simpler and more direct manner. <br />
<br />
== Methodology & Algorithm ==<br />
<br />
=== Methodology ===<br />
{| class="wikitable"<br />
|+Abbreviation Details<br />
!Acronym<br />
!Expansion<br />
|-<br />
|LP<br />
|Linear Programming<br />
|-<br />
|B&B<br />
|Branch and Bound<br />
|}<br />
<br />
==== Most Infeasible Branching: ====<br />
Most infeasible branching is a very popular method that picks the variable whose fractional part is closest to <math>0.5</math>, i.e., it selects the variable maximizing the score <math>s_i = 0.5-\left|x_i- \lfloor x_i \rfloor - 0.5\right|</math>. In other words, most infeasible branching picks a variable for which there is the least indication of which side the variable should be rounded to. However, the performance of this method is not much better than the rule of selecting a branching variable at random.<br />
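As a concrete illustration, the selection rule above can be written in a few lines of code. This is a minimal sketch, assuming <code>x</code> holds the fractional LP solution; the function name is illustrative, not standard:<br />

```python
import math

def most_infeasible_index(x):
    """Return the index of the variable whose fractional part is closest
    to 0.5, i.e. the one maximizing s_i = 0.5 - |frac(x_i) - 0.5|."""
    scores = [0.5 - abs((xi - math.floor(xi)) - 0.5) for xi in x]
    return max(range(len(x)), key=lambda i: scores[i])

# For the LP solution (1.88, 1.72): fractional parts are 0.88 and 0.72,
# scores are 0.12 and 0.28, so the second variable (index 1) is chosen.
idx = most_infeasible_index([1.88, 1.72])
```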
<br />
==== '''Strong Branching:''' ====<br />
For each fractional variable, strong branching tests the dual bound increase by solving the LP relaxations that result from branching on that variable. The variable that leads to the largest increase is selected as the branching variable for the current node. Despite its conceptual simplicity, strong branching is so far the most powerful branching technique in terms of the number of nodes in the B&B tree; this effectiveness, however, comes at a substantial computational cost, since several auxiliary LPs must be solved at every node.<br />
<br />
==== '''Pseudo Cost:''' ====<br />
[[File:Image.png|thumb|Pure pseudo cost branching]]<br />
Another way to approximate the value of a relaxation is the pseudo cost method. The pseudo-cost of a variable is an estimate of the per-unit change in the objective function from rounding the value of that variable up or down. Among the fractional variables, the one with the largest estimated LP objective gain is chosen for branching. <br />
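A sketch of how pseudo-cost selection might look, assuming per-unit estimates <code>psi_down</code> and <code>psi_up</code> have already been learned from earlier branchings; the names and the product scoring rule here are illustrative choices, not the article's prescription:<br />

```python
import math

def pseudo_cost_select(x, psi_down, psi_up, eps=1e-6):
    """Pick the fractional variable with the largest estimated objective
    gain, scored as (psi_down_i * f_i) * (psi_up_i * (1 - f_i)),
    where f_i is the fractional part of x_i."""
    best_i, best_score = None, -1.0
    for i, xi in enumerate(x):
        f = xi - math.floor(xi)
        if f < eps or f > 1 - eps:
            continue  # variable already (near) integral: not a candidate
        score = (psi_down[i] * f) * (psi_up[i] * (1.0 - f))
        if score > best_score:
            best_i, best_score = i, score
    return best_i

# With uniform unit pseudo-costs the rule behaves like most-infeasible
# branching: for (1.88, 1.72) it also picks index 1.
choice = pseudo_cost_select([1.88, 1.72], [1.0, 1.0], [1.0, 1.0])
```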
==='''Algorithm'''===<br />
Branch and Cut is a variation of the Branch and Bound algorithm that incorporates Gomory cuts, which tighten the LP relaxation and thereby shrink the search space of the given problem. The standard Simplex Algorithm is used to solve each LP relaxation.<br />
<br />
<br />
<math>\min \ c^T x</math><br />
<br />
<math>\text{s.t.} \ \ Ax \leq b</math><br />
<br />
<math>x \geq 0</math><br />
<br />
<math>x_i \in \mathbb{Z}, \ i = 1,2,3,\ldots,n</math><br />
<br />
Above is a mixed-integer linear programming problem, in which <math>x</math> and <math>c</math> are <math>n</math>-vectors. Restricting a variable to the values 0 or 1 yields a binary variable. The above problem can be denoted as <math>LP_n </math><br />
<br />
Below is an algorithm that applies Branch and Cut with Gomory cuts and partitioning:<br />
<br />
'''Step 0:'''<br />
Upper Bound = ∞<br />
Lower Bound = -∞<br />
'''Step 1. Initialize:'''<br />
<br />
Set the first node as <math>LP_0</math> and initialize the set of active nodes as <math>L = \{LP_0\}</math>. Nodes in the set are accessed as <math>LP_n </math><br />
<br />
===='''Step 2. Terminate:'''====<br />
If <math>L</math> is empty, stop: the incumbent solution <math>Z</math> is optimal (if no incumbent exists, the problem is infeasible).<br />
<br />
'''Step 3. Iterate through list L:'''<br />
<br />
While <math>L</math> is not empty, select a node <math>LP_l</math> (where <math>l</math> indexes the list <math>L</math>), then:<br />
<br />
'''Step 3.1. Convert to a Relaxation:'''<br />
<br />
Remove the integrality restrictions from <math>LP_l</math> to form its LP relaxation.<br />
<br />
'''Step 3.2. Solve:'''<br />
<br />
Solve the relaxed problem to obtain the objective value <math>Z^l</math> and solution <math>x^l</math>.<br />
<br />
'''Step 3.3.'''<br />
If the relaxation is infeasible:<br />
Return to step 3.<br />
else:<br />
Continue with solution <math>Z^l</math>.<br />
'''Step 4. Cutting Planes:'''<br />
If a cutting plane is found:<br />
then add it to the LP relaxation (as a constraint) and return to step 3.2.<br />
Else:<br />
Continue.<br />
'''Step 5. Pruning and Fathoming:'''<br />
<br />
(a) If <math>Z^l \geq Z</math>, the node cannot improve on the incumbent: prune it and go to step 3.<br />
(b) If <math>Z^l < Z</math> and <math>x^l</math> is integral feasible:<br />
Update the incumbent <math>Z = Z^l</math>.<br />
Remove from <math>L</math> all problems with bound <math>Z^i \geq Z</math>.<br />
'''Step 6. Partition'''<br />
<br />
Let <math>\{D^l_j\}_{j=1}^{k}</math> be a partition of the constraint set <math>D^l</math> of problem <math>LP_l</math>. Add problems <math>\{LP^l_j\}_{j=1}^{k}</math> to <math>L</math>, where <math>LP^l_j</math> is <math>LP_l</math> with its feasible region restricted to <math>D^l_j</math>, and <math>Z_{lj}</math> for <math>j=1,\ldots,k</math> is set to the value of <math>Z^l</math> of the parent problem <math>l</math>. Go to step 3.<br />
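The loop above can be sketched in code. The following is a minimal illustrative sketch, not a reference implementation: it solves each two-variable LP relaxation by brute-force vertex enumeration (a toy stand-in for the Simplex Algorithm of Step 3.2), branches on the most fractional variable, and leaves the cutting-plane search of Step 4 as a commented hook. All names are our own.<br />

```python
import math
from itertools import combinations

def solve_lp(cons, c, tol=1e-9):
    """Minimize c.x over {x : a.x <= b for (a, b) in cons}, for a bounded
    two-variable LP, by enumerating vertices (pairwise intersections)."""
    best = None
    for (a1, b1), (a2, b2) in combinations(cons, 2):
        det = a1[0] * a2[1] - a1[1] * a2[0]
        if abs(det) < tol:
            continue  # parallel constraint normals: no unique vertex
        v = ((b1 * a2[1] - b2 * a1[1]) / det,
             (a1[0] * b2 - a2[0] * b1) / det)
        if all(a[0] * v[0] + a[1] * v[1] <= b + 1e-7 for a, b in cons):
            val = c[0] * v[0] + c[1] * v[1]
            if best is None or val < best[0]:
                best = (val, v)
    return best  # None signals an infeasible node (Step 3.3)

def branch_and_cut(cons, c):
    L = [list(cons)]              # Step 1: active node list
    incumbent = (math.inf, None)  # Step 0: upper bound = infinity
    while L:                      # Step 2: terminate when L is empty
        node = L.pop()
        sol = solve_lp(node, c)   # Steps 3.1-3.2: solve the relaxation
        if sol is None or sol[0] >= incumbent[0] - 1e-9:
            continue              # Step 5(a): infeasible or bound-pruned
        val, x = sol
        # Step 4 would search for a violated cutting plane here and,
        # if one is found, append it to `node` and re-solve.
        fracs = [abs(xi - round(xi)) for xi in x]
        if max(fracs) < 1e-6:     # Step 5(b): integral feasible
            incumbent = (val, tuple(int(round(xi)) for xi in x))
            continue
        # Step 6: partition on the most fractional variable.
        i = max(range(len(x)), key=lambda j: fracs[j])
        up = [1.0 if j == i else 0.0 for j in range(len(x))]
        dn = [-u for u in up]
        L.append(node + [(tuple(up), float(math.floor(x[i])))])  # x_i <= floor
        L.append(node + [(tuple(dn), float(-math.ceil(x[i])))])  # x_i >= ceil
    return incumbent

# The numerical example below: min -4x1 - 7x2 subject to
# 6x1 + x2 <= 13, -x1 + 4x2 <= 5, x1, x2 >= 0.
cons = [((6.0, 1.0), 13.0), ((-1.0, 4.0), 5.0),
        ((-1.0, 0.0), 0.0), ((0.0, -1.0), 0.0)]
z, x = branch_and_cut(cons, (-4.0, -7.0))
```

On this instance the sketch reproduces the article's result: the incumbent found is <math>z=-15</math> at <math>(x_1,x_2)=(2,1)</math>.<br />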
<br />
==Numerical Example==<br />
First, list out the MILP:<br />
<br />
<math>min \ z=-4x_1-7x_2</math><br />
<br />
<math>6x_1 + x_2 \leq13</math><br />
<br />
<math>-x_1+4x_2\leq5</math><br />
<br />
<math>x_1,x_2\geq0</math><br />
<br />
Solution to the original LP relaxation<br />
<br />
<math>z =-19.56, x_1=1.88, x_2=1.72 </math><br />
<br />
<br />
Branch on <math>x_1</math> to generate two sub-problems<br />
<br />
<math>min \ z=-4x_1-7x_2</math><br />
<br />
<math>6x_1 + x_2 \leq13</math><br />
<br />
<math>-x_1+4x_2\leq5</math><br />
<br />
<math>x_1\geq2</math><br />
<br />
<math>x_1,x_2\geq0</math><br />
<br />
Solution to the first branch sub-problem (<math>x_1\geq2</math>), which is integral and becomes the incumbent<br />
<br />
<math>z =-15, x_1=2, x_2=1</math><br />
<br />
<math>min \ z=-4x_1-7x_2</math><br />
<br />
<math>6x_1 + x_2 \leq13</math><br />
<br />
<math>-x_1+4x_2\leq5</math><br />
<br />
<math>x_1\leq1</math><br />
<br />
<math>x_1,x_2\geq0</math><br />
<br />
Solution to the second branch sub-problem (<math>x_1\leq1</math>), which is still fractional<br />
<br />
<math>z =-14.5, x_1=1, x_2=1.5</math><br />
<br />
Since the solution is still fractional, add a cut <math>2x_1+x_2\leq 3</math> to the second sub-problem<br />
<br />
<math>min \ z=-4x_1-7x_2</math><br />
<br />
<math>6x_1 + x_2 \leq13</math><br />
<br />
<math>-x_1+4x_2\leq5</math><br />
<br />
<math>2x_1+x_2\leq 3</math><br />
<br />
<math>x_1\leq1</math><br />
<br />
<math>x_1,x_2\geq0</math><br />
<br />
Solution to the cut LP<br />
<br />
<math>z=-13.222,\ x_1=0.778,\ x_2=1.444</math><br />
<br />
Since <math>-13.222 > -15</math>, this node cannot improve on the integer-feasible incumbent from the first branch, so it is fathomed. The optimal solution is <math>z=-15</math> at <math>x_1=2,\ x_2=1</math>.<br />
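The claimed values can be checked arithmetically. A minimal sketch, using a hypothetical helper of our own that tests feasibility against the example's constraints and reports the objective:<br />

```python
def check(x1, x2, extra=()):
    """Return (feasible, objective) for a claimed solution of the example
    min -4x1 - 7x2 s.t. 6x1 + x2 <= 13, -x1 + 4x2 <= 5, x >= 0,
    with any branching bounds passed as extra predicates."""
    feasible = (6 * x1 + x2 <= 13 + 1e-6
                and -x1 + 4 * x2 <= 5 + 1e-6
                and x1 >= 0 and x2 >= 0
                and all(g(x1, x2) for g in extra))
    return feasible, -4 * x1 - 7 * x2

ok0, z0 = check(1.88, 1.72)                   # original LP relaxation
ok1, z1 = check(2, 1, [lambda a, b: a >= 2])  # first branch (integral)
ok2, z2 = check(1, 1.5, [lambda a, b: a <= 1])  # second branch
```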
<br />
==Application==<br />
Several applications of Branch and Cut are described below in more detail, along with how they can be used. These applications illustrate ways in which Branch and Cut can be used to optimize various problems efficiently.<br />
<br />
=== '''Combinatorial Optimization''' ===<br />
Combinatorial Optimization is a natural application for Branch and Cut. This style of optimization is the methodology of using known finite sets, and information about those sets, to optimize the solution. The original intent of this application was maximizing flow, as well as problems in the transportation industry (Maltby and Ross). Combinatorial optimization has since spread to new areas where it is used often, and it is now an imperative component of artificial intelligence and machine learning algorithms for optimizing solutions. The finite structures that Combinatorial Optimization tends to focus on include graphs, partially ordered sets, and the structures that define linear independence, called matroids.<br />
<br />
=== '''Bender’s Decomposition''' ===<br />
Bender’s Decomposition is another application related to Branch and Cut that is used widely in Stochastic Programming. In Bender’s Decomposition, the initial problem is divided into two distinct subsets so that each can be solved more easily than the original instance (Benders). The first problem is solved for the first set of variables; the second sub-problem is then solved for the remaining variables, given the first problem's solution. Solving the sub-problem determines whether the first-stage solution is infeasible (Benders), and Bender’s cuts can be added to constrain the problem until a feasible solution is found.<br />
<br />
=== '''Large-Scale Symmetric Traveling Salesman Problem''' ===<br />
The Large-Scale Symmetric Traveling Salesman Problem asks for the shortest route that visits each city exactly once and returns to the starting city. On a larger scale, this style of problem must be broken down into subsets or nodes (SIAM). By constraining the problem with methods such as those of Combinatorial Optimization, the Traveling Salesman Problem can be viewed in terms of partially ordered sets. Doing this on a large scale with finitely many cities makes it possible to optimize the shortest path taken while ensuring each city is visited only once.<br />
<br />
=== '''Submodular Function''' ===<br />
A Submodular Function is another kind of function used throughout artificial intelligence and machine learning. Such functions exhibit diminishing returns: as more inputs are added, the marginal value contributed by each additional input decreases. This is a useful property for optimization in the cases stated above, because input sets are continually growing, and it allows machine learning and artificial intelligence systems to continue to improve based on these algorithms (Tschiatschek, Iyer, and Bilmes). As new inputs are fed to the system, it learns more while keeping the resulting decisions near-optimal.<br />
<br />
==Conclusion==<br />
Branch and Cut is an optimization algorithm used to solve integer linear programs. It combines two other optimization algorithms, branch and bound and cutting planes, using the results of each method to reach the optimal solution. Three different branching methodologies are used within the method: most infeasible branching, strong branching, and pseudo cost branching. Furthermore, Branch and Cut can be utilized in multiple scenarios, including submodular functions, the large-scale symmetric traveling salesman problem, Bender's decomposition, and combinatorial optimization, which increases the impact of the methodology. <br />
<br />
==References==<br />
<br />
# A. Krause and C. Guestrin, Beyond Convexity: Submodularity in Machine Learning, Tutorial at ICML-2008<br />
# Benders, J. F. (Sept. 1962), "Partitioning procedures for solving mixed-variables programming problems", Numerische Mathematik 4(3): 238–252.<br />
# Karamanov, Miroslav. “Branch and Cut: An Empirical Study.” ''Carnegie Mellon University'' , Sept. 2006, https://www.cmu.edu/tepper/programs/phd/program/assets/dissertations/2006-operations-research-karamanov-dissertation.pdf.<br />
# Luedtke, Jim. “The Branch-and-Cut Algorithm for Solving Mixed-Integer Optimization Problems.” ''Institute for Mathematicians and Its Applications'', 10 Aug. 2016, https://www.ima.umn.edu/materials/2015-2016/ND8.1-12.16/25397/Luedtke-mip-bnc-forms.pdf.<br />
# Maltby, Henry, and Eli Ross. “Combinatorial Optimization.” ''Brilliant Math & Science Wiki'', https://brilliant.org/wiki/combinatorial-optimization/.<br />
# Society for Industrial and Applied Mathematics. “SIAM Rev.” ''SIAM Review'', 18 July 2006, https://epubs.siam.org/doi/10.1137/1033004.<br />
# S. Tschiatschek, R. Iyer, H. Wei and J. Bilmes, Learning Mixtures of Submodular Functions for Image Collection Summarization, NIPS-2014.<br />
#</div>JonBoisverthttps://optimization.cbe.cornell.edu/index.php?title=Branch_and_cut&diff=2287Branch and cut2020-12-11T02:36:01Z<p>JonBoisvert: /* Combinatorial Optimization */</p>
<hr />
<div>Author: Lindsay Siegmundt, Peter Haddad, Chris Babbington, Jon Boisvert, Haris Shaikh (SysEn 6800 Fall 2020)<br />
<br />
Steward: Wei-Han Chen, Fengqi You<br />
<br />
== Introduction ==<br />
The Branch and Cut methodology was discovered in the 90s as a way to solve/optimize Mixed-Integer Linear Programs (Karamanov, Miroslav). This concept is comprised of two known optimization methodologies - Branch and Bound and Cutting Planes. Utilizing these two tools allows for the Branch and Cut to find an optimal solution through relaxing the problem to produce the upper bound. Relaxing the problem allows for the complex problem to be simplified in order for it to be solve more easily. Furthermore, the upper bound represents the highest value the objective can take in order to be feasible. The optimal solution is found when the objective is equal to the upper bound (Luedtke, Jim). This methodology is critical to the future of optimization since it combines two common tools in order to utilize each component in order to find the optimal solution. Moving forward, the critical components of different methodologies could be combined in order to find optimality in a more simple and direct manner. <br />
<br />
== Methodology & Algorithm ==<br />
<br />
=== Methodology ===<br />
{| class="wikitable"<br />
|+Abbreviation Details<br />
!Acronym<br />
!Expansion<br />
|-<br />
|LP<br />
|Linear Programming<br />
|-<br />
|B&B<br />
|Branch and Bound<br />
|}<br />
<br />
==== Most Infeasible Branching: ====<br />
Most infeasible branching is a very popular method that picks the variable with fractional part closest to <math>0:5</math>, i.e.,<math> si = 0:5-|xA_i- xA_i-0:5|</math>. Most infeasible branching picks a variable where the least tendency can be recognized to which side the variable should be rounded. However, the performance of this method is not any superior to the rule of selecting a variable randomly.<br />
<br />
==== '''Strong Branching:''' ====<br />
For each fractional variable, strong branching tests the dual bound increase by computing the LP relaxations result from the branching on that variable. As a branching variable for the current for the current node, the variable that leads to the largest increases is selected. Despite its obvious simplicity, strong branching is so far the most powerful branching technique in terms of the number of nodes available in the B&B tree, this effectiveness can however be accomplished only at the cost of computation.<br />
<br />
==== '''Pseudo Cost:''' ====<br />
[[File:Image.png|thumb|Pure psuedo cost branching]]<br />
Another way to approximate a relaxation value is by utilizing a pseudo cost method. The pseudo-cost of a variable is an estimate of the per unit change in the objective function from making the value of the variable to be rounded up or down. For each variable we choose variable with the largest estimated LP objective gain. <br />
==='''Algorithm'''===<br />
Branch and Cut for is a variation of the Branch and Bound algorithm. Branch and Cut incorporates Gomery cuts allowing the search space of the given problem. The standard Simplex Algorithm will be used to solve each Integer Linear Programming Problem (LP).<br />
<br />
<br />
<math>min: c^tx<br />
</math><br />
<br />
<math>s.t. Ax < b<br />
</math><br />
<br />
<math>x \geq 0<br />
</math><br />
<br />
<math>x_i = int, i = 1,2,3...,n<br />
</math><br />
<br />
Above is a mix-integer linear programming problem. x and c are a part of the n-vector. These variables can be set to 0 or 1 allow binary variables. The above problem can be denoted as <math>LP_n </math><br />
<br />
Below is an Algorithm to utilize the Branch and Cut algorithm with Gomery cuts and Partitioning:<br />
<br />
'''Step 0:'''<br />
Upper Bound = ∞<br />
Lower Bound = -∞<br />
'''Step 1. Initialize:'''<br />
<br />
Set the first node as <math>LP_0</math> while setting the active nodes set as <math>L</math>. The set can be accessed via <math>LP_n </math><br />
<br />
===='''Step 2. Terminate:'''====<br />
Step 3. Iterate through list L:<br />
<br />
While <math>L</math> is not empty (i is the index of the list of L), then:<br />
<br />
'''Step 3.1. Convert to a Relaxation:'''<br />
<br />
'''Solve 3.2.'''<br />
<br />
Solve for the Relaxed<br />
<br />
'''Step 3.3.'''<br />
If Z is infeasible:<br />
Return to step 3.<br />
else:<br />
Continue with solution Z.<br />
'''Step 4. Cutting Planes:'''<br />
If a cutting plane is found:<br />
then add to the linear Relaxation problem (as a constraint) and return to step 3.2<br />
Else:<br />
Continue.<br />
'''Step 5. Pruning and Fathoming:'''<br />
<br />
(a)If ≥ Z:, then go to step 3.<br />
If Z^l <= Z AND X_i is an integral feasible:<br />
Z = Z^i<br />
Remove all Z^i from Set(L)<br />
'''Step 6. Partition'''<br />
<br />
Let <math>D^{lj=k}_{j=1}</math> be a partition of the constraint set <math>D</math> of problem <math>LP_l</math>. Add problems <math>D^{lj=k}_{j=1}</math> to L, where <math>LP^l_j</math> is <math>LP_l</math> with feasible region restricted to <math>D^l_j</math> and <math>Z_{lj}</math> for j=1,...k is set to the value of <math>Z^l</math> for the parent problem l. Go to step 3.<br />
<br />
==Numerical Example==<br />
First, list out the problem:<br />
<br />
min z = -8x1 - 7x2 <br />
<br />
s.t. 5x<sub>1</sub> - x<sub>2</sub> ≤ 13<br />
-x<sub>1</sub> + 5x<sub>2</sub> ≤ 4<br />
x<sub>1</sub>, x<sub>2</sub> ≥ 0 <br />
<br />
First Solution of MILP = Min = -32.625 (x<sub>1</sub> = 2.875, x<sub>2</sub> = 1.3750) <br />
<br />
Branch: <br />
<br />
min z = -8x<sub>1</sub> - 7x<sub>2</sub> <br />
<br />
s.t. 5x<sub>1</sub> - x<sub>2</sub> ≤ 13<br />
<br />
-x<sub>1</sub> + 5x<sub>2</sub> ≤ 4<br />
<br />
x<sub>1</sub> ≤ 2<br />
<br />
x<sub>2</sub> ≥ 0 <br />
<br />
Second Solution of MILP = Min = -24.4 (x1 = 2, x2 = 1.2) <br />
<br />
Branch: <br />
<br />
min z = -8x<sub>1</sub> - 7x<sub>2</sub> <br />
<br />
s.t. 5x<sub>1</sub> - x<sub>2</sub> ≤ 13<br />
<br />
-x<sub>1</sub> + 5x<sub>2</sub> ≤ 4<br />
<br />
x<sub>1</sub> ≤ 2<br />
<br />
x<sub>2</sub> ≥ 0 <br />
<br />
Add Cut: <br />
<br />
min z = -8x<sub>1</sub> - 7x<sub>2</sub><br />
<br />
s.t. 5x<sub>1</sub> - x<sub>2</sub> ≤ 13<br />
<br />
-x<sub>1</sub> + 5x<sub>2</sub> ≤ 4<br />
<br />
2x<sub>1</sub> ≤ 6 (Cut Dividing equation 1 by 2)<br />
<br />
x<sub>1</sub> ≤ 2<br />
<br />
x<sub>2</sub> ≥ 0 <br />
<br />
Solution with cut = Min = -24.4 (x<sub>1</sub> = 2, x<sub>2</sub> = 1.2)<br />
<br />
==Application==<br />
Several of the Branch and Cut applications are described below in more detail and how they can be used. These applications serve as methods in which Branch and Cut can be used to optimize various problems efficiently.<br />
<br />
=== '''Combinatorial Optimization''' ===<br />
Combinatorial Optimization is a great application for Branch and Cut. This style of optimization is the methodology of utilizing the finite known sets and information of the sets to optimize the solution. The original intent for this application was for maximizing flow as well as in the transportation industry (Maltby and Ross). This combinatorial optimization has also taken on some new areas where it is used often. Combinatorial Optimization is now an imperative component in studying artificial intelligence and machine learning algorithms to optimize solutions. The finite sets that Combinatorial Optimization tends to utilize and focus on includes graphs, partially ordered sets, and structures that define linear independence call matroids.<br />
<br />
=== '''Bender’s Decomposition''' ===<br />
Bender’s Decomposition is another Branch and Cut application that is utilized widely in Stochastic Programming. Bender’s Decomposition is where you take the initial problem and divide into two distinct subsets. By dividing the problem into two separate problems you are able to solve each set easier than the original instance (Benders). Therefore the first problem within the subset created can be solved for the first variable set. The second sub problem is then solved for, given that first problem solution. Doing this allows for the sub problem to be solved to determine whether the first problem is infeasible (Benders). Bender’s cuts can be added to constrain the problem until a feasible solution can be found.<br />
<br />
=== '''Large-Scale Symmetric Traveling Salesmen Problem''' ===<br />
The Large-Scale Symmetric Traveling Salesmen Problem is a common problem that was always looked into optimizing for the shortest route while visiting each city once and returning to the original city at the end. On a larger scale this style of problem must be broken down into subsets or nodes (SIAM). By constraining this style of problem such as the methods of Combinatorial Optimization, the Traveling Salesmen Problem can be viewed as partially ordered sets. By doing this on a large scale with finite cities you are able to optimize the shortest path taken and ensure each city is only visited once.<br />
<br />
=== '''Submodular Function''' ===<br />
Submodular Function is another function in which is used throughout artificial intelligence as well as machine learning. The reason for this is because as inputs are increased into the function the value or outputs decrease. This allows for a great optimization features in the cases stated above because inputs are continually growing. This allows for machine learning and artificial intelligence to continue to grow based on these algorithms (Tschiatschek, Iyer, and Bilmes). By enforcing new inputs to the system the system will learn more and more to ensure it optimizes the solution that is to be made.<br />
<br />
==Conclusion==<br />
The Branch and Cut is an optimization algorithm used to optimize integer linear programming. It combines two other optimization algorithms - branch and bound and cutting planes in order to utilize the results from each method in order to create the most optimal solution. There are three different methodologies used within the specific method - most infeasible branching, strong branching, and pseudo code. Furthermore, Branch and Cut can be utilized it multiple scenarios - Submodular function, large-scale symmetric traveling salesmen problem, bender's decomposition, and combination optimization which increases the impact of the methodology. <br />
<br />
==References==<br />
<br />
# A. Krause and C. Guestrin, Beyond Convexity: Submodularity in Machine Learning, Tutorial at ICML-2008<br />
# Benders, J. F. (Sept. 1962), "Partitioning procedures for solving mixed-variables programming problems", Numerische Mathematik 4(3): 238–252.<br />
# Karamanov, Miroslav. “Branch and Cut: An Empirical Study.” ''Carnegie Mellon University'' , Sept. 2006, https://www.cmu.edu/tepper/programs/phd/program/assets/dissertations/2006-operations-research-karamanov-dissertation.pdf.<br />
# Luedtke, Jim. “The Branch-and-Cut Algorithm for Solving Mixed-Integer Optimization Problems.” ''Institute for Mathematicians and Its Applications'', 10 Aug. 2016, https://www.ima.umn.edu/materials/2015-2016/ND8.1-12.16/25397/Luedtke-mip-bnc-forms.pdf.<br />
# Maltby, Henry, and Eli Ross. “Combinatorial Optimization.” ''Brilliant Math & Science Wiki'', https://brilliant.org/wiki/combinatorial-optimization/.<br />
# Society for Industrial and Applied Mathematics. “SIAM Rev.” ''SIAM Review'', 18 July 2006, https://epubs.siam.org/doi/10.1137/1033004.<br />
# S. Tschiatschek, R. Iyer, H. Wei and J. Bilmes, Learning Mixtures of Submodular Functions for Image Collection Summarization, NIPS-2014.<br />
#</div>JonBoisverthttps://optimization.cbe.cornell.edu/index.php?title=Branch_and_cut&diff=2284Branch and cut2020-12-11T02:29:39Z<p>JonBoisvert: /* Large-Scale Symmetric Traveling Salesmen Problem */</p>
<hr />
<div>Author: Lindsay Siegmundt, Peter Haddad, Chris Babbington, Jon Boisvert, Haris Shaikh (SysEn 6800 Fall 2020)<br />
<br />
Steward: Wei-Han Chen, Fengqi You<br />
<br />
== Introduction ==<br />
The Branch and Cut methodology was discovered in the 90s as a way to solve/optimize Mixed-Integer Linear Programs (Karamanov, Miroslav). This concept is comprised of two known optimization methodologies - Branch and Bound and Cutting Planes. Utilizing these two tools allows for the Branch and Cut to find an optimal solution through relaxing the problem to produce the upper bound. Relaxing the problem allows for the complex problem to be simplified in order for it to be solve more easily. Furthermore, the upper bound represents the highest value the objective can take in order to be feasible. The optimal solution is found when the objective is equal to the upper bound (Luedtke, Jim). This methodology is critical to the future of optimization since it combines two common tools in order to utilize each component in order to find the optimal solution. Moving forward, the critical components of different methodologies could be combined in order to find optimality in a more simple and direct manner. <br />
<br />
== Methodology & Algorithm ==<br />
<br />
=== Methodology ===<br />
{| class="wikitable"<br />
|+Abbreviation Details<br />
!Acronym<br />
!Expansion<br />
|-<br />
|LP<br />
|Linear Programming<br />
|-<br />
|B&B<br />
|Branch and Bound<br />
|}<br />
<br />
==== Most Infeasible Branching: ====<br />
Most infeasible branching is a very popular method that picks the variable with fractional part closest to <math>0:5</math>, i.e.,<math> si = 0:5-|xA_i- xA_i-0:5|</math>. Most infeasible branching picks a variable where the least tendency can be recognized to which side the variable should be rounded. However, the performance of this method is not any superior to the rule of selecting a variable randomly.<br />
<br />
==== '''Strong Branching:''' ====<br />
For each fractional variable, strong branching tests the dual bound increase by computing the LP relaxations result from the branching on that variable. As a branching variable for the current for the current node, the variable that leads to the largest increases is selected. Despite its obvious simplicity, strong branching is so far the most powerful branching technique in terms of the number of nodes available in the B&B tree, this effectiveness can however be accomplished only at the cost of computation.<br />
<br />
==== '''Pseudo Cost:''' ====<br />
[[File:Image.png|thumb|Pure psuedo cost branching]]<br />
Another way to approximate a relaxation value is by utilizing a pseudo cost method. The pseudo-cost of a variable is an estimate of the per unit change in the objective function from making the value of the variable to be rounded up or down. For each variable we choose variable with the largest estimated LP objective gain. <br />
==='''Algorithm'''===<br />
Branch and Cut for is a variation of the Branch and Bound algorithm. Branch and Cut incorporates Gomery cuts allowing the search space of the given problem. The standard Simplex Algorithm will be used to solve each Integer Linear Programming Problem (LP).<br />
<br />
<br />
<math>min: c^tx<br />
</math><br />
<br />
<math>s.t. Ax < b<br />
</math><br />
<br />
<math>x \geq 0<br />
</math><br />
<br />
<math>x_i = int, i = 1,2,3...,n<br />
</math><br />
<br />
Above is a mix-integer linear programming problem. x and c are a part of the n-vector. These variables can be set to 0 or 1 allow binary variables. The above problem can be denoted as <math>LP_n </math><br />
<br />
Below is an Algorithm to utilize the Branch and Cut algorithm with Gomery cuts and Partitioning:<br />
<br />
'''Step 0:'''<br />
Upper Bound = ∞<br />
Lower Bound = -∞<br />
'''Step 1. Initialize:'''<br />
<br />
Set the first node as <math>LP_0</math> while setting the active nodes set as <math>L</math>. The set can be accessed via <math>LP_n </math><br />
<br />
===='''Step 2. Terminate:'''====<br />
Step 3. Iterate through list L:<br />
<br />
While <math>L</math> is not empty (i is the index of the list of L), then:<br />
<br />
'''Step 3.1. Convert to a Relaxation:'''<br />
<br />
'''Solve 3.2.'''<br />
<br />
Solve for the Relaxed<br />
<br />
'''Step 3.3.'''<br />
If Z is infeasible:<br />
Return to step 3.<br />
else:<br />
Continue with solution Z.<br />
'''Step 4. Cutting Planes:'''<br />
If a cutting plane is found:<br />
then add to the linear Relaxation problem (as a constraint) and return to step 3.2<br />
Else:<br />
Continue.<br />
'''Step 5. Pruning and Fathoming:'''<br />
<br />
(a)If ≥ Z:, then go to step 3.<br />
If Z^l <= Z AND X_i is an integral feasible:<br />
Z = Z^i<br />
Remove all Z^i from Set(L)<br />
'''Step 6. Partition'''<br />
<br />
Let <math>D^{lj=k}_{j=1}</math> be a partition of the constraint set <math>D</math> of problem <math>LP_l</math>. Add problems <math>D^{lj=k}_{j=1}</math> to L, where <math>LP^l_j</math> is <math>LP_l</math> with feasible region restricted to <math>D^l_j</math> and <math>Z_{lj}</math> for j=1,...k is set to the value of <math>Z^l</math> for the parent problem l. Go to step 3.<br />
<br />
==Numerical Example==<br />
First, list out the problem:<br />
<br />
min z = -8x1 - 7x2 <br />
<br />
s.t. 5x<sub>1</sub> - x<sub>2</sub> ≤ 13<br />
-x<sub>1</sub> + 5x<sub>2</sub> ≤ 4<br />
x<sub>1</sub>, x<sub>2</sub> ≥ 0 <br />
<br />
First Solution of MILP = Min = -32.625 (x<sub>1</sub> = 2.875, x<sub>2</sub> = 1.3750) <br />
<br />
Branch: <br />
<br />
min z = -8x<sub>1</sub> - 7x<sub>2</sub> <br />
<br />
s.t. 5x<sub>1</sub> - x<sub>2</sub> ≤ 13<br />
<br />
-x<sub>1</sub> + 5x<sub>2</sub> ≤ 4<br />
<br />
x<sub>1</sub> ≤ 2<br />
<br />
x<sub>2</sub> ≥ 0 <br />
<br />
Second Solution of MILP = Min = -24.4 (x1 = 2, x2 = 1.2) <br />
<br />
Branch: <br />
<br />
min z = -8x<sub>1</sub> - 7x<sub>2</sub> <br />
<br />
s.t. 5x<sub>1</sub> - x<sub>2</sub> ≤ 13<br />
<br />
-x<sub>1</sub> + 5x<sub>2</sub> ≤ 4<br />
<br />
x<sub>1</sub> ≤ 2<br />
<br />
x<sub>2</sub> ≥ 0 <br />
<br />
Add Cut: <br />
<br />
min z = -8x<sub>1</sub> - 7x<sub>2</sub><br />
<br />
s.t. 5x<sub>1</sub> - x<sub>2</sub> ≤ 13<br />
<br />
-x<sub>1</sub> + 5x<sub>2</sub> ≤ 4<br />
<br />
2x<sub>1</sub> ≤ 6 (Cut Dividing equation 1 by 2)<br />
<br />
x<sub>1</sub> ≤ 2<br />
<br />
x<sub>2</sub> ≥ 0 <br />
<br />
Solution with cut = Min = -24.4 (x<sub>1</sub> = 2, x<sub>2</sub> = 1.2)<br />
<br />
==Application==<br />
Several of the Branch and Cut applications are described below in more detail and how they can be used. These applications serve as methods in which Branch and Cut can be used to optimize various problems efficiently.<br />
<br />
=== '''Combinatorial Optimization''' ===<br />
Combinatorial Optimization is a great application for Branch and Cut. This style of optimization is the methodology of utilizing the finite known sets and information of the sets to optimize the solution. The original intent for this application was for maximizing flow as well as in the transportation industry. This combinatorial optimization has also taken on some new areas where it is used often. Combinatorial Optimization is now an imperative component in studying artificial intelligence and machine learning algorithms to optimize solutions. The finite sets that Combinatorial Optimization tends to utilize and focus on includes graphs, partially ordered sets, and structures that define linear independence call matroids.<br />
<br />
=== '''Bender’s Decomposition''' ===<br />
Bender’s Decomposition is another Branch and Cut application that is utilized widely in Stochastic Programming. Bender’s Decomposition is where you take the initial problem and divide into two distinct subsets. By dividing the problem into two separate problems you are able to solve each set easier than the original instance. Therefore the first problem within the subset created can be solved for the first variable set. The second sub problem is then solved for, given that first problem solution. Doing this allows for the sub problem to be solved to determine whether the first problem is infeasible. Bender’s cuts can be added to constrain the problem until a feasible solution can be found.<br />
<br />
=== '''Large-Scale Symmetric Traveling Salesman Problem''' ===<br />
The Large-Scale Symmetric Traveling Salesman Problem asks for the shortest route that visits each city exactly once and returns to the starting city. On a large scale, this problem must be broken down into subsets, or nodes. By constraining the problem with techniques from combinatorial optimization, the Traveling Salesman Problem can be viewed in terms of partially ordered sets. Doing this on a large scale with a finite number of cities makes it possible to find the shortest route while ensuring each city is visited only once.<br />
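In branch and cut for the TSP, a standard separation step looks for a subtour in the current solution and adds the corresponding subtour-elimination cut. A sketch with a hypothetical <code>find_subtour</code> helper (not from the article):<br />

```python
# Separate a subtour-elimination cut for the symmetric TSP: given the
# edges selected by a relaxation, find a connected component S that
# excludes some city; S yields the violated cut
#   sum of edges inside S <= |S| - 1.
def find_subtour(n, edges):
    adj = {v: set() for v in range(n)}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, stack = {0}, [0]
    while stack:                     # depth-first search from city 0
        v = stack.pop()
        for w in adj[v]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return seen if len(seen) < n else None   # S for the cut, or None

# Integral but disconnected "solution" on 5 cities: two subtours.
S = find_subtour(5, [(0, 1), (1, 2), (2, 0), (3, 4), (4, 3)])
# S = {0, 1, 2}: add the cut  x01 + x12 + x02 <= 2
```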
<br />
=== '''Submodular Function''' ===<br />
A submodular function is a set function used throughout artificial intelligence and machine learning. Its defining property is diminishing returns: adding an element to a larger set increases the function's value by no more than adding the same element to a subset of it. This property is valuable in the applications above, where the pool of inputs keeps growing, because algorithms can exploit diminishing returns to select the inputs that improve the solution the most.<br />
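The diminishing-returns property can be illustrated with a small coverage function (the sets here are made up):<br />

```python
# Coverage function f(S) = number of elements covered by the chosen
# sets; coverage functions are a classic example of submodularity.
sets = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6}}

def coverage(chosen):
    covered = set()
    for name in chosen:
        covered |= sets[name]
    return len(covered)

# The marginal gain of adding "c" shrinks as the base set grows:
gain_small = coverage({"a", "c"}) - coverage({"a"})            # 3
gain_large = coverage({"a", "b", "c"}) - coverage({"a", "b"})  # 2
```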
<br />
==Conclusion==<br />
Branch and Cut is an algorithm used to solve integer linear programs. It combines two other optimization algorithms - branch and bound and cutting planes - using the results of each to move toward an optimal solution. Three branching methodologies are used within the method - most infeasible branching, strong branching, and pseudo-cost branching. Furthermore, Branch and Cut can be utilized in multiple scenarios - submodular functions, the large-scale symmetric traveling salesman problem, Benders decomposition, and combinatorial optimization - which increases the impact of the methodology. <br />
<br />
==References==<br />
<br />
# A. Krause and C. Guestrin, Beyond Convexity: Submodularity in Machine Learning, Tutorial at ICML-2008<br />
# Benders, J. F. (Sept. 1962), "Partitioning procedures for solving mixed-variables programming problems", Numerische Mathematik 4(3): 238–252.<br />
# Karamanov, Miroslav. “Branch and Cut: An Empirical Study.” ''Carnegie Mellon University'' , Sept. 2006, https://www.cmu.edu/tepper/programs/phd/program/assets/dissertations/2006-operations-research-karamanov-dissertation.pdf.<br />
# Luedtke, Jim. “The Branch-and-Cut Algorithm for Solving Mixed-Integer Optimization Problems.” ''Institute for Mathematicians and Its Applications'', 10 Aug. 2016, https://www.ima.umn.edu/materials/2015-2016/ND8.1-12.16/25397/Luedtke-mip-bnc-forms.pdf.<br />
# Maltby, Henry, and Eli Ross. “Combinatorial Optimization.” ''Brilliant Math & Science Wiki'', https://brilliant.org/wiki/combinatorial-optimization/.<br />
# Society for Industrial and Applied Mathematics. “SIAM Rev.” ''SIAM Review'', 18 July 2006, https://epubs.siam.org/doi/10.1137/1033004.<br />
# S. Tschiatschek, R. Iyer, H. Wei and J. Bilmes, Learning Mixtures of Submodular Functions for Image Collection Summarization, NIPS-2014.<br />
</div>