# Chance-constraint method

Authors: Anthony Pacheco, Bryan Bausinger, John Maher, Martin Lis, Trevor Spaulding (SYSEN 5800 Fall 2021)

## Introduction

The chance-constraint method of optimization is a process for working with random parameters within a problem while guaranteeing a certain level of performance. Uncertain variables in a project raise questions of reliability and risk, which make it difficult to determine the most likely result. In stochastic optimization, expected or nominal values are used for these uncertainties. While there is some risk in making these stochastic optimization decisions, they are usually considered standard since they were already accounted for: a known "penalty". In practice, however, these penalties are not always known entities, as with extreme weather events. In these unknown cases it is still important to plan for such events to occur and to make decisions around them. A common approach is to allow certain unexpected events to violate specific constraints, as long as overall constraint satisfaction is maintained with a given level of probability. In other words, certain levels of feasibility are guaranteed, and these are defined as chance constraints. For example, a distributor may guarantee that a certain amount of product will be delivered while taking into account chance constraints such as weather events, supply chain issues, or a flat tire on a delivery truck. A system's performance can be optimized under uncertain constraints via the chance-constraint method by accounting for these constraints and ensuring they satisfy well-defined reliability values.[1][2]

## Theory/Methodology

### General Optimization Problem

A general form of an optimization problem is shown below:[3]

${\displaystyle min\quad g(x)}$

${\displaystyle s.t.\quad h_{1}(x,\xi )\leq 0,...\,,h_{d}(x,\xi )\leq 0,x\in X}$

where ${\displaystyle g}$ is the objective function, ${\displaystyle h_{1},...\,,h_{d}}$ are inequality constraints, ${\displaystyle X\subseteq \mathbb {R} ^{m}}$ is a deterministic set, ${\displaystyle x\in X}$ is the decision vector, and ${\displaystyle \xi }$ is a parameter vector.

The above general form of the constrained optimization problem has multiple real world applications. However, in the real world the parameter ${\displaystyle \xi }$ is often uncertain and introduces a level of randomness to the optimization problem. Sometimes this uncertainty is bypassed by using the expected value of ${\displaystyle \xi }$ which can cause issues as the optimal solution to the above optimization problem may have a high chance of being infeasible. Therefore, the random vector needs to be accounted for depending on the optimization application.[3]
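To illustrate why substituting the expected value of ${\displaystyle \xi }$ can be misleading, consider a minimal sketch (the one-dimensional constraint and distribution below are illustrative assumptions, not taken from the references): minimize ${\displaystyle x}$ subject to ${\displaystyle x-\xi \geq 0}$ with ${\displaystyle \xi }$ standard normal. Solving with ${\displaystyle \mathbb {E} [\xi ]=0}$ gives ${\displaystyle x=0}$, which violates the constraint about half the time.

```python
import numpy as np

# Hypothetical illustration: minimize x subject to h(x, xi) = x - xi >= 0,
# where xi ~ Normal(0, 1). Replacing xi by its expected value E[xi] = 0
# yields the nominal "optimal" decision x = 0.
rng = np.random.default_rng(0)
xi_samples = rng.normal(loc=0.0, scale=1.0, size=100_000)

x_nominal = 0.0  # solution of the expected-value problem
feasible_prob = np.mean(x_nominal - xi_samples >= 0)
print(f"P(constraint holds) at nominal solution: {feasible_prob:.3f}")  # ~0.5

# A chance constraint P(x - xi >= 0) >= 0.95 instead requires x to be at
# least the 95th percentile of xi:
x_cc = np.quantile(xi_samples, 0.95)
print(f"decision satisfying the 95% chance constraint: x >= {x_cc:.2f}")  # ~1.64
```

The nominal solution is infeasible in roughly half of all realizations of ${\displaystyle \xi }$, which is the kind of risk the chance-constraint formulation is designed to control.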

### The Random Vector

The random vector, ${\displaystyle \xi }$, can be either time-dependent or time-independent. The wide-ranging possibilities of the random vector play the largest role in the difficult nature of the chance-constrained optimization problem, though generally speaking the random vector is considered to have a known distribution.[2] For example, the chance constraints can be linear in the random vector. This is illustrated in both of the constraint-defining equations below:[1]

${\displaystyle h(x,\xi )=g(x)-A\xi \quad or\quad h(x,\xi )=A(\xi )g(x)-b}$

where ${\displaystyle A}$ is a deterministic matrix, ${\displaystyle A(\xi )}$ is a stochastic matrix, ${\displaystyle x}$ is a decision variable, ${\displaystyle g}$ is a mapping on ${\displaystyle x}$, and ${\displaystyle b}$ is a vector of appropriate size.

While both instances of these constraint definitions are linear, the first keeps the random vector separate from the decision vector, whereas the second associates the random vector directly with the decision vector. Each of these forms of the random vector within the constraint mapping has real-world applications, most commonly in engineering and energy management.[1]

### General Chance Constrained Optimization

The chance-constrained optimization method was first introduced by Charnes, Cooper, and Symonds in 1958-1959 in the context of financial planning optimization. Over the years, improvements have been made to chance-constrained optimization theory and computational methods, most notably by András Prékopa.[2]

A general form of a chance constraint is shown below:

${\displaystyle \mathbb {P} (h(x,\xi )\geq 0)\geq p}$

where ${\displaystyle h(x,\xi )\geq 0}$ represents the constraints as a finite system of inequalities and ${\displaystyle p\in [0,1]}$ is the required probability level.

The chance constraint formulated above appears as a constraint within an optimization problem in which there is some objective function, usually denoted ${\displaystyle f(x)}$, that is to be minimized.

Compared to a conventional optimization problem, the chance-constrained optimization problem differs in that the probability on the left-hand side above is rarely available explicitly. Therefore, there is no general solution approach for chance-constrained optimization problems. The structure of the random and decision vectors can sometimes, but not always, lead to a linear programming solution.[1]
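Because the probability on the left-hand side is rarely available in closed form, it is often estimated by sampling. The following sketch (the mapping ${\displaystyle h}$, the distribution of ${\displaystyle \xi }$, and the probability level are all hypothetical choices for illustration) checks the feasibility of a candidate decision by Monte Carlo:

```python
import numpy as np

# Generic Monte Carlo feasibility check for a chance constraint
# P(h(x, xi) >= 0) >= p. The mapping h, the candidate decision x, and the
# probability level p below are illustrative assumptions.
def chance_constraint_satisfied(h, x, xi_samples, p):
    """Estimate P(h(x, xi) >= 0) from samples and compare it against p."""
    estimate = np.mean([h(x, xi) >= 0 for xi in xi_samples])
    return estimate, estimate >= p

rng = np.random.default_rng(1)
samples = rng.normal(loc=5.0, scale=2.0, size=50_000)

# Example: h(x, xi) = x - xi must be nonnegative with probability p = 0.9.
h = lambda x, xi: x - xi
prob, ok = chance_constraint_satisfied(h, x=8.0, xi_samples=samples, p=0.9)
print(f"estimated probability {prob:.3f}, feasible: {ok}")
```

With ${\displaystyle \xi \sim N(5,2)}$, the decision ${\displaystyle x=8}$ satisfies the constraint with probability about 0.93, so it passes the ${\displaystyle p=0.9}$ check.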

### Joint vs. Individual Chance Constraints

The chance constraint above can be written more explicitly as:

${\displaystyle \mathbb {P} (h_{j}(x,\xi )\geq 0(j=1,...,m))\geq p}$

This is called a joint chance constraint because a single probability covers the entire system of inequalities; in simpler terms, the constraint is considered as a whole. On the other hand, if each of the constraints in the system needs to hold with the given probability individually, the system is written as a set of single constraints, formulated below:

${\displaystyle \mathbb {P} (h_{j}(x,\xi )\geq 0)\geq p(j=1,...,m)}$

This type of constraint is known as an individual chance constraint (sometimes also called a single or separate chance constraint[2]).

The benefit of working with individual chance constraints is that they can be easier to solve, because each inequality is handled independently rather than as one complex inequality. However, in a system with a large number of constraints this generates an equally large number of inequalities, one per constraint. One advantage of joint chance constraints is therefore that there is only a single inequality to work with, and that inequality covers all of the constraints in the system. In comparison to the individual chance constraint, however, the joint chance constraint is often more difficult to solve numerically because the system has to be considered as a whole instead of part-wise.[1]

The joint and individual chance constraint models can be used to support one another if properly utilized. For instance, the individual chance constraints of a system can be used to bound the optimal value of the corresponding joint formulation. Moreover, if ${\displaystyle x}$ is permissible in a joint chance constraint of a system, then ${\displaystyle x}$ is also allowable within each individual chance constraint of that same system at the same probability level.[1]
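The relationship between the two formulations can be checked numerically. In the sketch below (a hypothetical two-constraint system with independent standard normal uncertainties, chosen purely for illustration), a decision that meets each individual constraint at ${\displaystyle p=0.9}$ satisfies the joint constraint only with probability near ${\displaystyle 0.81}$:

```python
import numpy as np

# Illustrative two-constraint system (assumed for demonstration):
# h1(x, xi) = x - xi1 >= 0 and h2(x, xi) = x - xi2 >= 0,
# with xi1, xi2 independent standard normal random variables.
rng = np.random.default_rng(2)
n = 200_000
xi1 = rng.normal(size=n)
xi2 = rng.normal(size=n)

x = 1.2816  # roughly the 90th percentile of a standard normal

ind1 = np.mean(x - xi1 >= 0)                       # individual constraint 1, ~0.90
ind2 = np.mean(x - xi2 >= 0)                       # individual constraint 2, ~0.90
joint = np.mean((x - xi1 >= 0) & (x - xi2 >= 0))   # joint constraint, ~0.81

# Joint satisfaction can be no higher than the weakest individual one:
# a decision feasible for each individual constraint at p = 0.9 need not
# be feasible for the joint constraint at p = 0.9.
print(f"individual: {ind1:.3f}, {ind2:.3f}; joint: {joint:.3f}")
```

This is exactly the caution noted above: satisfying constraints piecemeal does not guarantee the system-wide probability level.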

Caution should be taken when using only individual chance constraints in an attempt to oversimplify a problem. It can be appealing to model an entire system as a collection of separate inequalities, especially if a certain probability level of satisfaction is required over a specific amount of time. However, this modeling may lack robustness when the system is considered over the entire timeframe. As a brief example, if a system is required to reach a certain probability level over a lengthy timeframe, it may be appealing to model it using individual chance constraints over multiple shorter time slots, ensuring the probability level is satisfied in piecemeal fashion in each slot. While this may result in an easier-to-solve problem, the setup may break down when extended over the entire timeframe instead of the separate time slots. In instances such as these, joint chance constraints may lead to a lengthier problem and solution, but will result in a more robust answer.[4]

## Numerical Example

To explain chance constraints in a simplified example, consider a refinery that can take two raw materials (${\displaystyle raw_{1}}$, ${\displaystyle raw_{2}}$) and simultaneously produce two different goods (prod1, prod2). To understand the production costs (${\displaystyle \gamma }$), each output must be related to the amounts of raw material required to produce it (${\displaystyle c_{raw1}}$, ${\displaystyle c_{raw2}}$). Additionally, the demand for each product (${\displaystyle h_{prod1}}$, ${\displaystyle h_{prod2}}$) and the production capacity of the refinery (${\displaystyle h_{total}}$) must be understood. Below is a condensed table with the production constraints and costs, followed by a formulation of these constraints as a linear program.

| Raws | prod1 | prod2 | c | b |
|------|-------|-------|---|---|
| raw1 | 2 | 3 | 2 | 1 |
| raw2 | 6 | 3 | 3 | 1 |
| relation | ≥ | ≥ | | ≤ |
| h | 180 | 162 | γ | 100 |

${\displaystyle min\quad 2x_{raw1}+3x_{raw2}}$

${\displaystyle s.t.\quad x_{raw1}+x_{raw2}\leq 100}$,

${\displaystyle 2x_{raw1}+6x_{raw2}\geq 180}$,

${\displaystyle 3x_{raw1}+3x_{raw2}\geq 162}$,

${\displaystyle x_{raw1}\geq 0}$,

${\displaystyle x_{raw2}\geq 0}$.

Graphing the objective function and constraints allows us to immediately see the feasible region where all constraints are satisfied.

Visually analyzing the feasible region, it can be concluded that the following is the optimal solution to the problem:

${\displaystyle {\hat {x}}_{raw1}=36,{\hat {x}}_{raw2}=18,\gamma ({\hat {x}})=126}$
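The optimum read off the graph can be confirmed with a linear programming solver; the sketch below uses SciPy's `linprog` (assumed to be available) on the same data:

```python
from scipy.optimize import linprog

# Verifying the deterministic refinery LP. linprog minimizes subject to
# "<=" constraints, so the ">=" demand constraints are negated.
c = [2, 3]                     # raw material unit costs
A_ub = [[1, 1],                # capacity:  x1 + x2        <= 100
        [-2, -6],              # demand 1:  2*x1 + 6*x2    >= 180
        [-3, -3]]              # demand 2:  3*x1 + 3*x2    >= 162
b_ub = [100, -180, -162]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, res.fun)  # approximately [36. 18.] and 126.0
```

The solver recovers the same solution found graphically: ${\displaystyle {\hat {x}}_{raw1}=36}$, ${\displaystyle {\hat {x}}_{raw2}=18}$, with cost 126.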

This is an oversimplification of the problem, as all the variables are assumed to be fixed. For example, although the unit costs of the raw materials may be understood, they can be affected by natural disasters (fires or hurricanes destroying materials, supply chain disruptions), politics (sanctions), and other factors. Although these macro events may have little impact on immediate material costs, they lower the probability that an assumption of a fixed price will be accurate. More significantly, demand for the various products is harder to predict and therefore has higher uncertainty. Additionally, production capability at any given time is not fixed: there may be issues with machines (repairs, calibrations, etc.) or with machine operators (illness or vacation leaving too few operators to run all machines at full capacity). To make this problem more realistic, let's assume:

• The productivities of prod1 and prod2 vary randomly
• The weekly demands ${\displaystyle h_{prod1}}$ and ${\displaystyle h_{prod2}}$ vary randomly
• The weekly production plan is fixed before manufacturing, and the actual production of prod1 and prod2 can only be measured during production
• Clients expect actual demand to be satisfied

The more simplistic linear program defined above now turns into a more complex stochastic linear program:

${\displaystyle min\quad 2x_{raw1}+3x_{raw2}}$

${\displaystyle s.t.\quad x_{raw1}+x_{raw2}\leq 100}$,

${\displaystyle (2+{\tilde {\eta }}_{1})x_{raw1}+6x_{raw2}\geq 180+{\tilde {\zeta }}_{1}}$,

${\displaystyle 3x_{raw1}+(3.4-{\tilde {\eta }}_{2})x_{raw2}\geq 162+{\tilde {\zeta }}_{2}}$,

${\displaystyle x_{raw1}\geq 0}$,

${\displaystyle x_{raw2}\geq 0}$.

where ${\displaystyle {\tilde {\zeta }}_{1}}$ and ${\displaystyle {\tilde {\zeta }}_{2}}$ are random variables modeled with normal distributions, while ${\displaystyle {\tilde {\eta }}_{1}}$ and ${\displaystyle {\tilde {\eta }}_{2}}$ follow uniform and exponential distributions, respectively.

By adjusting the values of ${\displaystyle {\tilde {\eta }}_{1}}$ and ${\displaystyle {\tilde {\eta }}_{2}}$ based on a desired confidence level, the effect on the feasible region can be seen.

The difficulty in implementing a problem like this is that a production plan must be determined based on the uncertainty of the situation and how much risk one is willing to accept. There does exist a "safe" solution, in which the feasible region is determined by the requirement to satisfy every constraint, even at different probabilities.

Although this may be the safest solution, it may not be optimal, as one may not procure enough raw material to meet demand that exceeds the "safe" solution. Additional complexities and refinements can be considered to improve the optimized solution but are omitted from this example.[5]
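A simple way to explore this trade-off numerically is to sample scenarios and estimate, for a few candidate plans, the probability that all demands are met. The distribution parameters below are illustrative assumptions (the example states only the distribution families), and scaling up the nominal plan is just one hypothetical hedging strategy:

```python
import numpy as np

# Monte Carlo sketch of the chance-constrained refinery plan. The
# distribution parameters are illustrative assumptions; the example only
# states that zeta_1, zeta_2 are normal and eta_1, eta_2 are
# uniform/exponential.
rng = np.random.default_rng(3)
n = 20_000
eta1 = rng.uniform(-0.8, 0.8, size=n)
eta2 = rng.exponential(scale=0.4, size=n)
zeta1 = rng.normal(0, 12, size=n)
zeta2 = rng.normal(0, 9, size=n)

def satisfaction_prob(x1, x2):
    """Fraction of sampled scenarios in which both stochastic demands are met."""
    ok1 = (2 + eta1) * x1 + 6 * x2 >= 180 + zeta1
    ok2 = 3 * x1 + (3.4 - eta2) * x2 >= 162 + zeta2
    return np.mean(ok1 & ok2)

# The nominal plan (36, 18) is only feasible in a fraction of scenarios;
# scaling it up buys reliability at extra cost (2*x1 + 3*x2).
for scale in (1.0, 1.1, 1.2):
    x1, x2 = 36 * scale, 18 * scale
    cost = 2 * x1 + 3 * x2
    print(f"plan ({x1:.0f}, {x2:.0f}): cost {cost:.0f}, "
          f"P(all demands met) = {satisfaction_prob(x1, x2):.2f}")
```

Under these assumed distributions, the nominal deterministic plan satisfies all demands only part of the time, while larger (costlier) plans raise the satisfaction probability; a chance-constrained formulation would pick the cheapest plan meeting a target probability ${\displaystyle p}$.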

## Applications

The chance-constraint optimization approach can be extremely beneficial in any application where there are unknown variables, but with probabilities that can be associated with each variable. Some examples of real-world applications are as follows:

### Stock Portfolio

Chance-constrained methods are often used as a risk-aversion technique when designing a stock portfolio. Within this method, there is a general predicted value for the final wealth and an associated survival probability.[6]

### Utility Demand (Power/Gas/Water)

Energy generation, particularly from renewable sources such as solar and wind, has inherently high variability. It is no longer possible to fully predict energy generation as in the past; therefore, a probabilistic model is needed to provide and distribute energy to all homes with sufficient certainty. Hu et al. analyze the economic dispatch of wind energy to meet dispatch needs in very large instances using the chance-constraint approach.[7]

### Transportation

Intermodal Freight Transportation (IFT) is the transportation of a material from one place to another involving at least two modes of transportation (e.g., sea, air, road). Each shipping modality has its own stochastic factors such as priority, travel time, and availability. IFT provides the opportunity for faster shipment times, lower costs, improved throughput, and broader market availability. Zhao et al. modeled a two-stage chance-constrained approach with the stages being sea and rail, respectively. A network of available stations (rail), seaports (sea), and hubs (both) is modeled to create the available paths for goods. Their research shows that the chance-constrained method allows for different optimal service conditions. Factoring in the varying probability of shipment duration allows late-cost considerations to be incorporated, adding more variability to the design.[8]

### Water Reservoir Management

Clean, unpolluted water has become a scarcer resource in recent years. To address this issue, storage reservoirs in major cities such as Beijing, China are designed to distribute water during demand periods. These demand periods vary with uncertainty, so the reservoir must be designed to distribute water with satisfactory probability under all conditions. Such a model may explore impacts such as season, pollution level, and storage location, and provide insight for future regulation.[9]

## Conclusion

The chance-constraint method allows users to identify an optimized solution even when there are outstanding unknown risks within the process. The methodology incorporates variable volatility into the optimization constraints to account for that volatility. The user is able to specify what level of uncertainty is acceptable in the solution, which expands the region of available solutions that could contain the optimum. However, by increasing the level of uncertainty, the user also increases the possible variation in the final answer due to the random variables. Many different applications benefit from the chance-constraint method. One of the most common industrial applications is logistics planning, ranging from power distribution and global transportation networks to natural resource management.[7][8][9] By recognizing and preparing for unknowns in a problem, the chance-constraint method is a popular optimization process when dealing with uncertainties.

## Sources

1. W. van Ackooij, R. Zorgati, R. Henrion, and A. Möller, Chance Constrained Programming and Its Applications to Energy Management, 2011.
2. A. Geletu, M. Klöppel, H. Zhang, and P. Li, "Advances and applications of chance-constrained approaches to systems optimisation under uncertainty," International Journal of Systems Science, vol. 44, pp. 1209-1232, 2013.
3. S. Peng, "Chance constrained problem and its applications," Université Paris Saclay; Xi'an Jiaotong University, 2019.
4. W. van Ackooij, R. Henrion, A. Möller, and R. Zorgati, "On probabilistic constraints induced by rectangular sets and multivariate normal distributions," Mathematical Methods of Operations Research, vol. 71(3), pp. 535-549, 2010.
5. P. Kall and S.W. Wallace, Stochastic Programming, pp. 10-20, 1994.
6. N.H. Agnew, R.A. Agnew, J. Rasmussen, and K.R. Smith, "An Application of Chance-Constrained Programming to Portfolio Selection in a Casualty Insurance Firm," Management Science, vol. 15, no. 10, pp. 512-520, 1969.
7. Y. Hu, Y. Li, M. Xu, L. Zhou, and M. Cui, "A Chance-Constrained Economic Dispatch Model in Wind-Thermal-Energy Storage System," Energies, vol. 10(3), 2017.
8. Y. Zhao, Q. Xue, Z. Cao, and X. Zhang, "A Two-Stage Chance Constrained Approach with Application to Stochastic Intermodal Service Network Design Problems," Journal of Advanced Transportation, 2018.
9. W. Li, K. Jiao, Z. Bao, Y. Xie, J. Zhen, G. Huang, and L. Fu, "Chance-constrained dynamic programming for multiple water resources allocation management associated with risk-aversion analysis: A case study of Beijing, China," Water, vol. 9(8):596, 2017.