Sensitivity Analysis in Python

Learn sensitivity analysis using Python and why it is important for decision makers to be able to interpret a model.

In today’s world, creating models is not enough; we also need to be able to explain them from different aspects. Model interpretability is a crucial topic for data scientists and analysts who want to understand a model and get the most out of it. Sometimes it reveals new and unique data patterns for decision-making.

Sensitivity analysis is a method for exploring the impact of changes in input parameters on an LP model. We change one parameter, keep the others constant, and check the effect on the model output. The main goal of sensitivity analysis is to observe how such changes affect the optimal solution of the LP model; it can provide additional insights or information about that optimal solution.

We can perform Sensitivity Analysis in 3 ways:

  • A change in the value of an objective function coefficient.
  • A change in the right-hand-side value of a constraint.
  • A change in a coefficient of a constraint.

In this tutorial, we are going to cover the following topics:

Shadow Price and Slack Variable

The shadow price is the change in the optimal value of the objective function per unit increase in the right-hand side (RHS) of a constraint, with everything else unchanged.

The slack variable is the amount of a resource that is unused; its value indicates how much the constraint's RHS can change while staying within the range of feasibility. If slack = 0, the constraint is a binding constraint, and changing a binding constraint changes the optimal solution. If slack > 0, the constraint is non-binding, and any change to its RHS within the range of feasibility will not affect the optimal value of the objective function.
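As a quick numerical illustration of the slack calculation (using the constraint and optimal solution from the glass example solved later in this tutorial), the slack is simply the RHS minus the amount of the resource actually used:

# Slack = RHS - resource used at the optimum
# Numbers taken from the example below: constraint 2*A + 1*B <= 22 with A = 9, B = 4
rhs = 22
used = 2 * 9 + 1 * 4   # = 22
slack = rhs - used     # = 0, so this constraint is binding
print(slack)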

Modeling Linear Programming Using Python PuLP

Various excellent optimization packages are available in Python, such as SciPy, PuLP, Gurobi, and CPLEX. In this article, we will focus on the PuLP library, a general-purpose, open-source linear programming modeling package for Python.

Install the pulp package:

 !pip install pulp

The PuLP modeling process has the following steps for solving an LP problem (a minimal skeleton of this pattern is shown after the list):

  • Initialize Model
  • Define Decision Variable
  • Define Objective Function
  • Define the Constraints
  • Solve Model
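Here is a minimal sketch of these five steps with made-up variable names and coefficients (x, y, and the numbers below are placeholders, not part of the glass problem we solve next):

from pulp import LpProblem, LpMaximize, LpVariable, LpStatus, value

model = LpProblem("Sketch", LpMaximize)   # 1. Initialize model
x = LpVariable("x", lowBound=0)           # 2. Define decision variables
y = LpVariable("y", lowBound=0)
model += 3 * x + 2 * y                    # 3. Define objective function
model += x + y <= 10                      # 4. Define the constraints
model.solve()                             # 5. Solve model
print(LpStatus[model.status], value(model.objective))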

Problem Statement

This problem is taken from Introduction to Management Science by Stevenson and Ozgur.

A glass manufacturing company produces two types of glass products, A and B.

A= Quantity of type A glass

B= Quantity of type B glass

Objective Function: Profit = 60 * A + 50 * B

Constraints:

  • Constraint 1: 4 * A + 10 * B <= 100
  • Constraint 2: 2 * A + 1 * B <= 22
  • Constraint 3: 3 * A + 3 * B <= 39
  • A, B >= 0

Initialize Model

In this step, we will import all the classes and functions of the pulp module and create a maximization LP problem using the LpProblem class.

from pulp import *

# Initialize Class, Define Vars., and Objective
model = LpProblem("Glass_Manufacturing_Industries_Profits",LpMaximize)

Define Decision Variable

In this step, we will define the decision variables. In our problem, we have two variables: A (the quantity of type A glass) and B (the quantity of type B glass). Let’s create them using the LpVariable class. LpVariable takes the following four arguments (a hypothetical example using all four appears after the code below):

  • First, an arbitrary name for what this variable represents.
  • Second, the lower bound on this variable.
  • Third, the upper bound.
  • Fourth, the type of data (discrete or continuous). The options for this parameter are LpContinuous or LpInteger.
# Define variables
A = LpVariable('A', lowBound=0)
B = LpVariable('B', lowBound=0)
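For illustration only, a variable that uses all four arguments might look like the following (the upper bound of 20 and the integer requirement are hypothetical, not part of the problem statement):

# Hypothetical variable: at most 20 whole units of some product C
C = LpVariable('C', lowBound=0, upBound=20, cat=LpInteger)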

Define Objective Function

In this step, we will define the objective function to maximize by adding it to the LpProblem object.

# Define Objective Function: Profit on Product A and B
model += 60 * A + 50 * B 

Define the Constraints

In this step, we will add the three resource constraints from the problem to the LpProblem object. The non-negativity constraints (A, B >= 0) are already handled by the lowBound=0 argument of LpVariable.

# Constraint 1
model += 4 * A + 10 * B <= 100

# Constraint 2
model += 2 * A + 1 * B <= 22

# Constraint 3
model += 3 * A + 3 * B <= 39

Solve Model

In this step, we will solve the LP problem by calling the solve() method. We can then print the model status, the objective value, and the decision variable values as follows.

# Solve Model
model.solve()

print("Model Status:{}".format(LpStatus[model.status]))
print("Objective = ", value(model.objective))

for v in model.variables():
    print(v.name,"=", v.varValue)
Output:
Model Status:Optimal
Objective =  740.0
A = 9.0
B = 4.0

Sensitivity Analysis: Compute Shadow Price and Slack Variables

import pandas as pd

# Each PuLP constraint exposes its shadow price (.pi) and slack (.slack)
o = [{'name': name, 'shadow price': c.pi, 'slack': c.slack} for name, c in model.constraints.items()]

print(pd.DataFrame(o))
Output:
name  shadow price  slack
0  _C1     -0.000000   24.0
1  _C2     10.000000   -0.0
2  _C3     13.333333   -0.0
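Before interpreting these numbers, we can cross-check the slack of C1 by hand from the optimal solution printed above (A = 9, B = 4):

# Slack of C1 = RHS - resource used at the optimum
print(100 - (4 * A.varValue + 10 * B.varValue))  # 24.0, matching the table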

You can see that the shadow prices for constraints C2 and C3 are 10 and 13.33. This means that a unit change in the RHS of constraint C2 or C3 changes the objective function by 10 or 13.33, respectively. We will see this in the example in a later section.

The slack value for constraint C1 is 24, meaning 24 units of that resource are unused, so C1 is a non-binding constraint: any change to its RHS within the range of feasibility will not affect the optimal value of the objective function. Constraints C2 and C3 have slack = 0, so they are binding constraints, and changing them changes the optimal solution. We can verify the non-binding behavior by changing the RHS of constraint C1. Let’s see the following example:

# Initialize Class, Define Vars., and Objective
model = LpProblem("Glass_Manufacturing_Industries_Profits",LpMaximize)

# Define variables
A = LpVariable('A', lowBound=0)
B = LpVariable('B', lowBound=0)

# Define Objective Function: Profit on Product A and B
model += 60 * A + 50 * B 

# Constraint 1
model += 4 * A + 10 * B <= 76 # RHS reduced from 100 to 76 (the amount actually used, still within the feasibility range)

# Constraint 2
model += 2 * A + 1 * B <= 22

# Constraint 3
model += 3 * A + 3 * B <= 39

# Solve Model
model.solve()

print("Model Status:{}".format(LpStatus[model.status]))
print("Objective = ", value(model.objective))

for v in model.variables():
    print(v.name,"=", v.varValue)
    
o = [{'Name':name,'Constraint':c,'shadow price':c.pi,'slack': c.slack} for name, c in model.constraints.items()]

print(pd.DataFrame(o))

Output:
Model Status:Optimal
Objective =  740.0
A = 9.0
B = 4.0
  Name     Constraint  shadow price  slack
0  _C1  {A: 4, B: 10}           2.5   -0.0
1  _C2   {A: 2, B: 1}          25.0   -0.0
2  _C3   {A: 3, B: 3}          -0.0   -0.0

The objective value is still 740 and the optimal solution is unchanged (A = 9, B = 4), which confirms that changing the RHS of the non-binding constraint C1 within its feasibility range does not affect the optimal value of the objective function.

Sensitivity Analysis: Understand the Shadow Price

As you have seen, the shadow prices for constraints C2 and C3 are 10 and 13.33, which means a unit change in the RHS of C2 or C3 will change the objective function by 10 or 13.33, respectively. Let’s see this in the following example, where we increase the RHS of constraint C2 by one unit:

# Initialize Class, Define Vars., and Objective
model = LpProblem("Glass_Manufacturing_Industries_Profits",LpMaximize)

# Define variables
A = LpVariable('A', lowBound=0)
B = LpVariable('B', lowBound=0)

# Define Objective Function: Profit on Product A and B
model += 60 * A + 50 * B 

# Constraint 1
model += 4 * A + 10 * B <= 100

# Constraint 2
model += 2 * A + 1 * B <= 23 # RHS increased from 22 to 23

# Constraint 3
model += 3 * A + 3 * B <= 39

# Solve Model
model.solve()

print("Model Status:{}".format(LpStatus[model.status]))
print("Objective = ", value(model.objective))

for v in model.variables():
    print(v.name,"=", v.varValue)
    
o = [{'Name':name,'Constraint':c,'shadow price':c.pi,'slack': c.slack} for name, c in model.constraints.items()]

print(pd.DataFrame(o))
Output:
Model Status:Optimal
Objective =  750.0
A = 10.0
B = 3.0
  Name     Constraint  shadow price  slack
0  _C1  {A: 4, B: 10}     -0.000000   30.0
1  _C2   {A: 2, B: 1}     10.000000   -0.0
2  _C3   {A: 3, B: 3}     13.333333   -0.0

As you can see, we increased the RHS of constraint C2 by 1 unit and the objective function increased by 10 units (from 740 to 750), which is exactly what the shadow price of C2 predicted.
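As a quick sanity check using the numbers printed above, the shadow price lets us predict the new objective value without re-solving the model:

# Predicted objective = old objective + shadow price of C2 * change in its RHS
predicted = 740.0 + 10.0 * 1
print(predicted)               # 750.0
print(value(model.objective))  # 750.0, matching the prediction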

Summary

Congratulations, you have made it to the end of this tutorial!

In this article, we have learned about sensitivity analysis in LP modeling, model interpretability, shadow prices, and slack variables, with examples using the Python PuLP library. We solved a linear programming problem with PuLP and focused on sensitivity analysis with a practical demonstration. Of course, this is just a very basic example of sensitivity analysis; in upcoming articles, we will write more about different optimization problems with sensitivity analysis. You can revise the basics of the mathematical concepts and learn about linear programming using PuLP in my earlier articles. I have also written about other optimization problems such as transshipment, assignment, and blending problems.

Avinash Navlani
