Linear optimization in Python: Using SciPy for linear programming

In previous posts I showed how to conduct optimization in R (linear optimization with lpSolve, quadratic optimization with quadprog and non-linear gradient descent optimization with nloptr). In this post I show how to model and solve the following linear optimization problem using SciPy in Python (reconstructed here in plain text from the code below):

max  2*x1 + 3*x2
s.t. x1 + x2 <= 10
     2*x1 + x2 <= 15
     x1, x2 >= 0

In the SciPy package in Python I can use the linprog function to model and solve this simple linear optimization problem. Since linprog minimizes, I state the problem in vector-matrix notation and transform it into a minimization problem by negating the objective coefficients:

min  -2*x1 - 3*x2
s.t. x1 + x2 <= 10
     2*x1 + x2 <= 15
     x1, x2 >= 0

In Python I can solve this problem as follows:

# set up cost list with objective function coefficient values (negated for minimization)
c = [-2, -3]

# set up constraint coefficient matrix A
A_ub = [[1, 1],
        [2, 1]]

# right-hand sides of the upper-bound (less-than-or-equal) constraints
b_ub = [10, 15]

# in addition, I need to prepare a bounds tuple for each optimization variable and collect them in a list
x1_bounds = (0,None)
x2_bounds = (0,None)

# now I use SciPy.optimize.linprog to model and solve the problem at hand
from scipy.optimize import linprog
model_linear = linprog(c=c,
                       A_ub=A_ub,
                       b_ub=b_ub,
                       bounds=[x1_bounds, x2_bounds])

# output model solution
print(model_linear)
     fun: -30.0
 message: 'Optimization terminated successfully.'
     nit: 1
   slack: array([ 0.,  5.])
  status: 0
 success: True
       x: array([  0.,  10.])

Be aware that, since we transformed the problem into a minimization problem for linprog, the optimal objective function value of the original maximization problem is not -30.0 but 30.0.
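As a quick sanity check, the original maximization optimum can be recovered by negating the minimized objective value returned in the result's fun attribute. A minimal self-contained sketch (newer SciPy versions use the "highs" solver by default, but no extra arguments are needed here):

```python
from scipy.optimize import linprog

# objective coefficients, negated so that minimization solves the maximization problem
c = [-2, -3]

# upper-bound constraint matrix and right-hand sides
A_ub = [[1, 1],
        [2, 1]]
b_ub = [10, 15]

result = linprog(c=c,
                 A_ub=A_ub,
                 b_ub=b_ub,
                 bounds=[(0, None), (0, None)])

# negate the minimized objective to recover the maximum of the original problem
print(-result.fun)  # 30.0
print(result.x)     # optimal decision variables, approx. [0., 10.]
```

The same pattern applies whenever a maximization problem is solved through linprog: negate the objective coefficients going in, and negate fun coming out.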
