Solving Optimization Problems

Python Code of Particle Swarm Optimization (PSO) Algorithm

Hello everyone. In this video, I’m going to show you the Python code of the Particle Swarm Optimization (PSO) algorithm and test its performance in solving two simple optimization problems (one maximization problem and one minimization problem).

In addition, I will show you how to customize this Python PSO code to solve other optimization problems. If you have any questions, please leave your comments below. I will try to answer them as soon as possible.

Let’s get started.

For more videos like this, check my YouTube channel here.

import random
import math
import matplotlib.pyplot as plt
#------------------------------------------------------------------------------
# TO CUSTOMIZE THIS PSO CODE TO SOLVE UNCONSTRAINED OPTIMIZATION PROBLEMS, CHANGE THE PARAMETERS IN THIS SECTION ONLY:
# THE FOLLOWING PARAMETERS MUST BE CHANGED.
def objective_function(x):
    y = 3*(1-x[0])**2*math.exp(-x[0]**2 - (x[1]+1)**2) - 10*(x[0]/5 - x[0]**3 - x[1]**5)*math.exp(-x[0]**2 - x[1]**2) - 1/3*math.exp(-(x[0]+1)**2 - x[1]**2)
    return y
 
bounds=[(-3,3),(-3,3)]   # lower and upper bounds of the variables
nv = 2                   # number of variables
mm = -1                   # if minimization problem, mm = -1; if maximization problem, mm = 1
 
# THE FOLLOWING PARAMETERS ARE OPTIONAL.
particle_size=100         # number of particles
iterations=200           # max number of iterations
w=0.85                    # inertia constant
c1=1                    # cognitive constant
c2=2                     # social constant
# END OF THE CUSTOMIZATION SECTION
#------------------------------------------------------------------------------    
class Particle:
    def __init__(self,bounds):
        self.particle_position=[]                     # particle position
        self.particle_velocity=[]                     # particle velocity
        self.local_best_particle_position=[]          # best position of the particle
        # ... (code omitted)
    def evaluate(self,objective_function):
        self.fitness_particle_position=objective_function(self.particle_position)
        if mm == -1:
            if self.fitness_particle_position < self.fitness_local_best_particle_position:
                self.local_best_particle_position=list(self.particle_position)           # update the local best (use a copy, not a reference)
                self.fitness_local_best_particle_position=self.fitness_particle_position  # update the fitness of the local best
        # ... (code omitted)
    def update_velocity(self,global_best_particle_position):
        for i in range(nv):
            r1=random.random()
            r2=random.random()
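            # note: r1 and r2 are new random numbers every time update_velocity is called,
            # i.e. for every particle, every dimension, and every iteration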
 
            cognitive_velocity = c1*r1*(self.local_best_particle_position[i] - self.particle_position[i])
            social_velocity = c2*r2*(global_best_particle_position[i] - self.particle_position[i])
            self.particle_velocity[i] = w*self.particle_velocity[i]+ cognitive_velocity + social_velocity
 
    def update_position(self,bounds):
        for i in range(nv):
            self.particle_position[i]=self.particle_position[i]+self.particle_velocity[i]
 
            # check and repair to satisfy the upper bounds
            if self.particle_position[i]>bounds[i][1]:
                self.particle_position[i]=bounds[i][1]
            # ... (code omitted)
class PSO():
    def __init__(self,objective_function,bounds,particle_size,iterations):
 
        fitness_global_best_particle_position=initial_fitness
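        # initial_fitness is expected to be defined in the omitted part of the code
        # (a very large value for minimization, a very small value for maximization)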
        global_best_particle_position=[]
 
        swarm_particle=[]
        for i in range(particle_size):
            swarm_particle.append(Particle(bounds))
        A=[]
         
        for i in range(iterations):
            for j in range(particle_size):
                swarm_particle[j].evaluate(objective_function)
                # ... (code omitted)
Sorry! This is only half of the code.

Notice: It’s possible to watch the video and re-type the Python code yourself – that would take you 1 to 3 hours; or, with just €1.99 (the cost of a cup of coffee), you can download/copy the whole Python code within 2 minutes. It’s your choice to make.

The original price is €4.99, but today it’s only €1.99 (save €3 today – available for a limited time only).

Download the whole Python code here (Membership Code ID: 005)

No need to build the Python code from scratch, because that is very time-consuming. My idol, Jim Rohn, once said: “Time is more valuable than money. You can get more money, but you cannot get more time.” If you think this code can be used in your research/teaching work, you should download it and then customize/modify/apply it to your work, without any obligation to cite the original source if you don’t want to. However, redistribution (i.e., downloading the code/script here and then making it available on another site on the Internet) is strictly prohibited.

If you have any questions or problems, please contact Dr. Panda by email: learnwithpanda2018@gmail.com

Thank you very much and good luck with your research!

Dr. Panda

Comments

    • Hi, record the fitness value of the global best particle and check it in every iteration to see whether it is improving. Set a limit on the number of consecutive iterations in which the best fitness does not improve, and stop the algorithm when that limit is reached. Use the 'break' statement to jump out of the loop and stop your PSO. Good luck!
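      For reference, a minimal sketch of this stopping rule (the names no_improve_limit, no_improve_count and previous_best are hypothetical additions, not part of the original code; it assumes a minimization run with the global best fitness stored in fitness_global_best_particle_position):
      no_improve_limit = 30          # assumed value: stop after 30 iterations without improvement
      no_improve_count = 0
      previous_best = float("inf")   # minimization: start from a very large value
      for i in range(iterations):
          # ... evaluate the swarm and update the global best here ...
          if fitness_global_best_particle_position < previous_best:
              previous_best = fitness_global_best_particle_position
              no_improve_count = 0
          else:
              no_improve_count += 1
          if no_improve_count >= no_improve_limit:
              break                  # jump out of the loop and stop the PSO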

  • Hello Sir,
    I want to use this code to solve the problem of optimal capacitor sizing and placement.
    Can you please help me with the editing?

    • Hi, thanks for your interest in my PSO code. To solve a new problem, we must update the objective function and the constraints.

      • Can you please help me with that?
        I would pay for your services.
        My work is a constrained optimization problem of optimal sizing and placement of capacitors.

        • Hi, currently, I give general advice for FREE. I don't have time to solve research problems for other people. I don't accept money from anyone. Thanks for your interest!

          • OK sir, then I need your advice on editing and updating the code to solve the optimal capacitor sizing and placement problem.

          • Hi, to solve a new problem, we need to update the objective function and the constraints. That's all. The rest of the PSO code can stay the same.
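            As an illustration only (this exact problem is hypothetical and not part of the original code), replacing the customization section for a new minimization problem, with a simple penalty term as one common way to handle a constraint, could look like this:
            def objective_function(x):
                y = x[0]**2 + x[1]**2                 # new objective to be minimized
                if x[0] + x[1] > 4:                   # example constraint: x[0] + x[1] <= 4
                    y = y + 1000*(x[0] + x[1] - 4)    # penalty for violating the constraint
                return y

            bounds = [(0, 5), (0, 5)]   # new lower and upper bounds of the variables
            nv = 2                      # number of variables
            mm = -1                     # minimization problem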

  • Hello. I just want to clarify something: do the random numbers generated for the acceleration coefficients r1 and r2 in lines 48 and 49 stay the same throughout the optimization process, or is a new random number drawn at every iteration step?

    • Hi, it is possible for the PSO code to work on image processing but we have to modify the objective function. "imread" can be used to load the images as far as I can remember.

  • Hello sir,
    what does the number of variables (nv) in the code mean? The optimal solution is displayed as a list of values; what does that indicate?

    • The number of variables is the number of decision variables, i.e., the variables whose optimal values you want to find. For example, z = x + y is a problem with 2 variables, x and y.
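      For example, the z = x + y problem mentioned above would be set up like this (an illustrative sketch, with assumed bounds):
      def objective_function(x):
          return x[0] + x[1]          # z = x + y, where x[0] plays the role of x and x[1] of y

      nv = 2                          # two decision variables
      bounds = [(-3, 3), (-3, 3)]     # one (lower, upper) pair per variable (assumed values)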

  • Hello,
    Is it possible to build a confusion matrix with this code? How do we need to define the y_pred values?

  • Good day sir,
    Thanks for this really helpful code. However, could you please explain what mm means in this code? I have read the comment on it and I still don't quite understand what it is for.
    Any help would be appreciated.

    • Hi, mm is my notation for minimization or maximization problems (if minimization problem, mm = -1; if maximization problem, mm = 1). This code can solve both types of problems.

  • Hello,
    The code is great. However, how would you add code to show another plot with the final placement of the particles at the end of the optimization?

    • Yes, it is possible to do that because we know the positions of the particles at the end of the optimization, i.e., the final iteration.
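      As a rough sketch (assuming the swarm is stored in a list called swarm_particle, as in the PSO class above, and that this snippet runs after the main loop of a 2-variable problem), the final positions could be plotted with matplotlib, which the code already imports:
      x_final = [p.particle_position[0] for p in swarm_particle]   # first coordinate of each particle
      y_final = [p.particle_position[1] for p in swarm_particle]   # second coordinate of each particle
      plt.figure()
      plt.scatter(x_final, y_final)
      plt.xlabel('x[0]')
      plt.ylabel('x[1]')
      plt.title('Final placement of the particles')
      plt.grid(True)
      plt.show()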

  • Good day,
    I just want to ask why you had to add [0] or [1] next to x when defining the function,
    i.e. y = 3*(1-x[0])**2*math.exp(-x[0]**2 - (x[1]+1)**2) - 10*(x[0]/5 - x[0]**3 - x[1]**5)*math.exp(-x[0]**2 - x[1]**2) - 1/3*math.exp(-(x[0]+1)**2 - x[1]**2)
    What is the purpose of those, and how do I know whether I need to put a 1 or a 0 next to x?

    • Hi, they are indices that select the elements of the vector x. For example, x[0] is the first element of x and x[1] is the second element of x. I hope that's clear.
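      For example (illustration only, not part of the original code), a function of three variables would use x[0], x[1] and x[2], and the number of variables would be set to 3:
      def objective_function(x):
          return x[0]**2 + 2*x[1] + x[2]        # three decision variables

      nv = 3                                    # number of variables
      bounds = [(-3, 3), (-3, 3), (-3, 3)]      # one (lower, upper) pair per variable (assumed values)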
