Heuristic optimization in Python

Following the previous articles on interfaces (+) and (exact) solvers (+) for optimization in Python, in this article, I introduce some packages that provide an easy-to-use “interface” for artificially intelligent algorithms (AIAs) (e.g., heuristics, metaheuristics, matheuristics, learnheuristics, hyperheuristics, or simheuristics).

AIAs do not guarantee globally optimal solutions, so the user should always be clear about why they were chosen for a given use case and be prepared to tune their parameters or increase their adaptiveness to the landscape of the optimization problem. Likewise, depending on the characteristics of the optimization model (and the programming expertise of the user), AIAs may return infeasible solutions. In exchange, contrary to the (exact) solvers introduced before, they can obtain solutions in considerably less time. Their most significant advantage is generality: regardless of the features of the optimization problem (e.g., LP, MIP, MILP, or MINLP), an AIA can act as a general problem solver (GPS) and tackle all of them.

Notably, this is not a complete survey of what is available, but I will try to extend it over time. With each package introduced, I implement either GA (the genetic algorithm, introduced in 1975) or PSO (particle swarm optimization, introduced in 1995), two of the most famous AIAs; hence, the packages covered here should support at least one of them. Finally, since the optimization model presented here (+) has constraints, I model them as penalty terms in the objective function, as reconstructed below.
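For orientation, this is the model both examples encode, as I reconstruct it from their decoding logic (treat it as my reading of the code rather than a formal statement): maximize 2x + 5y subject to 5x + 3y ≤ 10 and 2x + 7y ≤ 9, with x ∈ {0, 1, 2} integer and 0 ≤ y ≤ 1.29 continuous. Each constraint is normalized by its right-hand side, and the largest violation is squared and added to the negated objective; the penalty weight of 150 is simply the value used in the examples, not a derived constant. A minimal sketch of this transformation:

# A minimal sketch of the penalty transformation used in both examples.
# The weight of 150 is a tunable assumption taken from the code below.
def penalized_objective(x, y, weight=150):
    g1 = 5/10 * x + 3/10 * y - 1   # 5x + 3y <= 10, normalized
    g2 = 2/9 * x + 7/9 * y - 1     # 2x + 7y <= 9, normalized
    violation = max(0.0, g1, g2)   # largest relative violation
    return -(2*x + 5*y) + weight * violation**2  # minimizing this maximizes 2x + 5y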

1. geneticalgorithm for optimization in Python

  • Supported algorithms: GA.
  • Supporting multiple objectives: No.

Here, I provide a coding example using “geneticalgorithm” in Python:

# Installation (Uncomment the line below)
#!pip install geneticalgorithm

import numpy as np
from geneticalgorithm import geneticalgorithm as ga

# Define characteristics of variables:
# Both variables are searched in [0, 1] and decoded to their true
# ranges inside the objective function below.
varbound = np.array([[0, 1], [0, 1]])
vartype = np.array([['real'], ['real']])

# Define settings of the algorithm:
algorithm_param = {'max_num_iteration': 100,            # number of generations
                   'population_size': 60,
                   'mutation_probability': 0.1,
                   'elit_ratio': 0.01,                   # share of elites kept ('elit' is the package's spelling)
                   'crossover_probability': 0.5,
                   'parents_portion': 0.3,
                   'crossover_type': 'uniform',
                   'max_iteration_without_improv': None} # no early stopping
    
# Define your optimization model (normalized variables are decoded here):
def MyOptProb(X):
    y = 0 + X[1] * (1.29 - 0)          # decode y to [0, 1.29] (continuous)
    x = np.round(0 + X[0] * (2 - 0))   # decode x to {0, 1, 2} (integer)
    g1 = 5/10 * x + 3/10 * y - 1       # 5x + 3y <= 10, normalized
    g2 = 2/9 * x + 7/9 * y - 1         # 2x + 7y <= 9, normalized
    penalty = np.amax(np.array([0, g1, g2]))  # largest constraint violation
    return -(2*x + 5*y) + 150*penalty**2      # minimizing this maximizes 2x + 5y

# Define a solution retriever:
def Results(obj, variables):
    x = round(0 + variables[0] * (2 - 0))  # decode x
    y = 0 + variables[1] * (1.29 - 0)      # decode y (note: variables[1], not variables[0])
    g1 = 5 * x + 3 * y - 10                # constraint slacks on the original scale
    g2 = 2 * x + 7 * y - 9
    print("g1:", g1)
    print("g2:", g2)
    if g1 > 1e-6 or g2 > 1e-6:             # feasibility tolerance
        print("infeasible")
    else:
        print("feasible")
    print("x:", x)
    print("y:", y)
    print("obj:", 2*x + 5*y)

# Model and solve the problem:
model = ga(function=MyOptProb,
           dimension=2,
           variable_type_mixed=vartype,
           variable_boundaries=varbound,
           algorithm_parameters=algorithm_param)
model.run()

# Display results:    
Results(model.best_function,model.best_variable)   
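Note that geneticalgorithm also supports mixed integer/real variables natively, so the [0, 1] decoding above is a portable trick rather than a requirement. A minimal sketch of the same model with native bounds (same objective and penalty; I have not re-tuned the penalty weight for this variant):

# Sketch: native mixed variable types instead of manual [0, 1] decoding.
import numpy as np
from geneticalgorithm import geneticalgorithm as ga

def MyOptProbNative(X):
    x, y = X[0], X[1]                        # already in their true ranges
    g1 = 5/10 * x + 3/10 * y - 1
    g2 = 2/9 * x + 7/9 * y - 1
    penalty = np.amax(np.array([0, g1, g2]))
    return -(2*x + 5*y) + 150*penalty**2

model = ga(function=MyOptProbNative,
           dimension=2,
           variable_type_mixed=np.array([['int'], ['real']]),
           variable_boundaries=np.array([[0, 2], [0, 1.29]]),
           algorithm_parameters=algorithm_param)  # settings reused from above
model.run()  # model.best_variable then holds x and y directly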

2. pyswarms for optimization in Python

  • Supported algorithms: PSO.
  • Supporting multiple objectives: No.

Here, I provide a coding example using “pyswarms” in Python:

# Installation (Uncomment the line below)
# !pip install pyswarms

from pyswarms.single.global_best import GlobalBestPSO
import numpy as np

# Define characteristics of variables:
# As before, both variables are searched in [0, 1] and decoded
# inside the objective function.
x_min = [0, 0]
x_max = [1, 1]
bounds = (x_min, x_max)
dim = len(x_min)

# Define settings of the algorithm:
pop = 100          # number of particles
iterations = 250   # number of iterations
options = {'c1': 0.5, 'c2': 0.4, 'w': 0.9}  # cognitive, social, and inertia weights

# Define your optimization model (pyswarms passes the whole swarm at once):
def MyOptProb(swarm):
    y = 0 + swarm[:, 1] * (1.29 - 0)          # decode y to [0, 1.29]
    x = np.round(0 + swarm[:, 0] * (2 - 0))   # decode x to {0, 1, 2}
    g1 = 5 / 10 * x + 3 / 10 * y - 1          # 5x + 3y <= 10, normalized
    g2 = 2 / 9 * x + 7 / 9 * y - 1            # 2x + 7y <= 9, normalized
    # Take the largest violation per particle (axis=0), not over the whole swarm:
    penalty = np.amax(np.array([np.zeros(pop), g1, g2]), axis=0)
    return -(2 * x + 5 * y) + 150 * penalty ** 2  # one cost per particle

# Define a solution retriever:
def Results(obj, variables):
    x = round(0 + variables[0] * (2 - 0))  # decode x
    y = 0 + variables[1] * (1.29 - 0)      # decode y (note: variables[1], not variables[0])
    g1 = 5 * x + 3 * y - 10                # constraint slacks on the original scale
    g2 = 2 * x + 7 * y - 9
    if g1 > 1e-6 or g2 > 1e-6:             # feasibility tolerance
        print('infeasible')
    else:
        print('feasible')
    print('x:', x)
    print('y:', y)
    print('obj:', 2 * x + 5 * y)

# Model and solve the problem:
optimizer = GlobalBestPSO(n_particles=pop, dimensions=dim,
                          options=options, bounds=bounds)
obj, variables = optimizer.optimize(MyOptProb, iters=iterations)

# Display results:
Results(obj, variables)
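One pyswarms detail worth emphasizing: the cost function receives the whole swarm at once as an array of shape (n_particles, dimensions) and must return one cost per particle, i.e., an array of shape (n_particles,). That is why the axis=0 in the penalty computation above matters. A minimal sketch of this contract, using a plain sphere function for illustration:

# Sketch: the vectorized cost-function contract pyswarms expects.
import numpy as np

def sphere(swarm):
    # swarm has shape (n_particles, dimensions);
    # the return value must have shape (n_particles,).
    return (swarm ** 2).sum(axis=1)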


If this article is going to be used in research or other publications, you can cite it as Tafakkori (2022) (in text) and refer to it as follows: Tafakkori, K. (2022). Artificially intelligent algorithms for optimization in Python. Supply Chain Data Analytics. URL: https://www.supplychaindataanalytics.com/artificially-intelligent-algorithms-for-optimization-in-python/


1 comment

Marco says:

Nice post. Check out FST-PSO as an alternative to classic PSO: simpler interface and no hyperparameters to be set. It’s available on GitHub and PyPI.
