Hello everyone and welcome!
In this post, I’m going to show you the Python code of my Multi-Objective Hybrid Genetic Algorithm, also called the Hybrid Non-Dominated Sorting Genetic Algorithm (Hybrid NSGA-II). It is a new and improved version of NSGA-II.
For more details about the famous NSGA-II, please have a look at the original paper [1].
We developed Hybrid NSGA-II by adding a local search algorithm to NSGA-II [1] to improve the solution quality.
Below are the details of the benchmark function used to test the performance of the developed Hybrid NSGA-II. If you want to download the Python code of Hybrid NSGA-II, check the link below. If you have any questions or requests, please let me know by leaving a comment.
Let’s see how my Hybrid NSGA-II works:
For more videos like this, check my YouTube channel here.
import random as rn
import numpy as np
import matplotlib.pyplot as plt
import math
#_____________________________________________________________________________
def random_population(nv, n, lb, ub):
    # nv = number of variables
    # n = number of random solutions
    # lb = lower bound
    # ub = upper bound
    pop = np.zeros((n, nv))
    for i in range(n):
        pop[i, :] = np.random.uniform(lb, ub)
    return pop
#_____________________________________________________________________________
def crossover(pop, crossover_rate):
    # one-point crossover: generates crossover_rate offspring from random parent pairs
    offspring = np.zeros((crossover_rate, pop.shape[1]))
    for i in range(int(crossover_rate/2)):
        r1 = np.random.randint(0, pop.shape[0])
        r2 = np.random.randint(0, pop.shape[0])
        while r1 == r2:  # make sure the two parents are different
            r1 = np.random.randint(0, pop.shape[0])
            r2 = np.random.randint(0, pop.shape[0])
        cutting_point = np.random.randint(1, pop.shape[1])
        offspring[2*i, 0:cutting_point] = pop[r1, 0:cutting_point]
        offspring[2*i, cutting_point:] = pop[r2, cutting_point:]
        offspring[2*i+1, 0:cutting_point] = pop[r2, 0:cutting_point]
        offspring[2*i+1, cutting_point:] = pop[r1, cutting_point:]
    return offspring
#_____________________________________________________________________________
def mutation(pop, mutation_rate):
    # swap mutation: each offspring pair exchanges one randomly chosen gene
    offspring = np.zeros((mutation_rate, pop.shape[1]))
    for i in range(int(mutation_rate/2)):
        r1 = np.random.randint(0, pop.shape[0])
        r2 = np.random.randint(0, pop.shape[0])
        while r1 == r2:  # make sure the two parents are different
            r1 = np.random.randint(0, pop.shape[0])
            r2 = np.random.randint(0, pop.shape[0])
        mutation_point = np.random.randint(0, pop.shape[1])
        offspring[2*i] = pop[r1]
        offspring[2*i, mutation_point] = pop[r2, mutation_point]
        offspring[2*i+1] = pop[r2]
        offspring[2*i+1, mutation_point] = pop[r1, mutation_point]
    return offspring
#_____________________________________________________________________________
def local_search(pop, n, step_size):
    # n = number of offspring chromosomes generated by the local search
    offspring = np.zeros((n, pop.shape[1]))
    for i in range(n):
        r1 = np.random.randint(0, pop.shape[0])
        chromosome = pop[r1, :]
        ...
        offspring[i, :] = chromosome
    return offspring
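The perturbation step inside the loop is elided above (it is part of the full download). Assuming the local search nudges one gene by at most ±step_size and clips the result to the variable bounds, a minimal self-contained sketch could look like this; the helper name `local_search_step` and the clipping behavior are my assumptions, not the original code:

```python
import numpy as np

def local_search_step(chromosome, step_size, lb, ub):
    # Nudge one randomly chosen gene by up to +/- step_size,
    # then clip it back inside the variable bounds (assumed behavior).
    candidate = chromosome.copy()
    j = np.random.randint(0, candidate.shape[0])
    candidate[j] += np.random.uniform(-step_size, step_size)
    candidate[j] = np.clip(candidate[j], lb[j], ub[j])
    return candidate
```

Inside the loop one would then call something like `chromosome = local_search_step(pop[r1, :], step_size, lb, ub)`.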
#_____________________________________________________________________________
def evaluation(pop):
    fitness_values = np.zeros((pop.shape[0], 2))  # two objective functions
    for i, chromosome in enumerate(pop):
        n = 3  # use the instance with n = 3
        for j in range(2):
            if j == 0:  # objective 1
                fitness_values[i, j] = 1 - math.exp(-sum((chromosome - 1/math.sqrt(n))**2))
            elif j == 1:  # objective 2
                fitness_values[i, j] = 1 - math.exp(-sum((chromosome + 1/math.sqrt(n))**2))
    return fitness_values
#_____________________________________________________________________________
def crowding_calculation(fitness_values):
    pop_size = len(fitness_values[:, 0])
    fitness_value_number = len(fitness_values[0, :])
    matrix_for_crowding = np.zeros((pop_size, fitness_value_number))
    ...
    crowding_distance = np.sum(matrix_for_crowding, axis=1)  # crowding distance of each solution
    return crowding_distance
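The filling of matrix_for_crowding is elided above. For reference, the standard NSGA-II crowding-distance computation [1] can be sketched as follows; the function name and the exact normalization are mine and may differ from the full version:

```python
import numpy as np

def crowding_distance(fitness_values):
    # Standard NSGA-II crowding distance: for each objective, sort the
    # solutions, give the boundary solutions infinite distance, and add
    # the normalized gap between each solution's two neighbors.
    pop_size, n_obj = fitness_values.shape
    distance = np.zeros(pop_size)
    for j in range(n_obj):
        order = np.argsort(fitness_values[:, j])
        distance[order[0]] = np.inf
        distance[order[-1]] = np.inf
        f_range = fitness_values[order[-1], j] - fitness_values[order[0], j]
        if f_range == 0:
            continue  # all values equal for this objective
        for k in range(1, pop_size - 1):
            distance[order[k]] += (fitness_values[order[k+1], j]
                                   - fitness_values[order[k-1], j]) / f_range
    return distance
```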
#_____________________________________________________________________________
def remove_using_crowding(fitness_values, number_solutions_needed):
    pop_index = np.arange(fitness_values.shape[0])
    crowding_distance = crowding_calculation(fitness_values)
    selected_pop_index = np.zeros((number_solutions_needed))
    selected_fitness_values = np.zeros((number_solutions_needed, len(fitness_values[0, :])))
    for i in range(number_solutions_needed):
        pop_size = pop_index.shape[0]
        solution_1 = rn.randint(0, pop_size - 1)
        ...
    selected_pop_index = np.asarray(selected_pop_index, dtype=int)  # convert to integer indices
    return selected_pop_index
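The tournament body above is elided. One common way to finish this function is a binary tournament on crowding distance, removing each winner from the pool so it cannot be picked twice. The sketch below takes the crowding distances as an argument for self-containment (the original computes them internally via crowding_calculation); the name `crowding_tournament` is my own:

```python
import random as rn
import numpy as np

def crowding_tournament(crowding, number_needed):
    # Pick two random candidates; the one in the less crowded region
    # (larger crowding distance) wins and is removed from the pool.
    pop_index = np.arange(crowding.shape[0])
    selected = np.zeros(number_needed, dtype=int)
    for i in range(number_needed):
        a = rn.randint(0, pop_index.shape[0] - 1)
        b = rn.randint(0, pop_index.shape[0] - 1)
        winner = a if crowding[a] >= crowding[b] else b
        selected[i] = pop_index[winner]
        pop_index = np.delete(pop_index, winner)
        crowding = np.delete(crowding, winner)
    return selected
```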
#_____________________________________________________________________________
def pareto_front_finding(fitness_values, pop_index):
    pop_size = fitness_values.shape[0]
    pareto_front = np.ones(pop_size, dtype=bool)  # initially assume all solutions are in the Pareto front
    for i in range(pop_size):
        for j in range(pop_size):
            ...
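The dominance test inside the double loop is elided. For a minimization problem with the convention used here (all solutions start flagged as non-dominated), a standard completion looks like this sketch; I have assumed both objectives are minimized, as in the benchmark above:

```python
import numpy as np

def pareto_front_finding(fitness_values, pop_index):
    # Solution j dominates solution i if j is no worse in every objective
    # and strictly better in at least one (both objectives minimized).
    pop_size = fitness_values.shape[0]
    pareto_front = np.ones(pop_size, dtype=bool)
    for i in range(pop_size):
        for j in range(pop_size):
            if np.all(fitness_values[j] <= fitness_values[i]) and \
               np.any(fitness_values[j] < fitness_values[i]):
                pareto_front[i] = False  # i is dominated, drop it
                break
    return pop_index[pareto_front]
```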
#_____________________________________________________________________________
def selection(pop, fitness_values, pop_size):
    pop_index_0 = np.arange(pop.shape[0])
    pop_index = np.arange(pop.shape[0])
    pareto_front_index = []
    while len(pareto_front_index) < pop_size:
        ...
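The body of the survivor selection is elided. Conceptually, NSGA-II peels off Pareto fronts until pop_size survivors are collected, then trims the last, overflowing front by crowding distance. The sketch below is self-contained; to stay short it trims the last front at random, which is exactly where the original would call remove_using_crowding instead:

```python
import numpy as np

def nondominated(fitness):
    # Boolean mask of non-dominated rows (both objectives minimized).
    n = fitness.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        for j in range(n):
            if np.all(fitness[j] <= fitness[i]) and np.any(fitness[j] < fitness[i]):
                keep[i] = False
                break
    return keep

def selection(pop, fitness_values, pop_size):
    # Peel off successive Pareto fronts until pop_size survivors are found.
    remaining = np.arange(pop.shape[0])
    chosen = []
    while len(chosen) < pop_size:
        front = remaining[nondominated(fitness_values[remaining])]
        if len(chosen) + len(front) > pop_size:
            # Overflowing last front: trim at random here (the original
            # trims by crowding distance via remove_using_crowding).
            front = np.random.permutation(front)[:pop_size - len(chosen)]
        chosen.extend(front.tolist())
        remaining = np.setdiff1d(remaining, front)
    chosen = np.array(chosen, dtype=int)
    return pop[chosen], fitness_values[chosen]
```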
#_____________________________________________________________________________
# Parameters
nv = 3
lb = [-4, -4, -4]
ub = [4, 4, 4]
pop_size = 100
rate_crossover = 30
rate_mutation = 20
rate_local_search = 30
step_size = 0.1
pop = random_population(nv,pop_size,lb,ub)
#_____________________________________________________________________________
# Main loop of Hybrid NSGA-II
for i in range(150):
    offspring_from_crossover = crossover(pop, rate_crossover)
    offspring_from_mutation = mutation(pop, rate_mutation)
    offspring_from_local_search = local_search(pop, rate_local_search, step_size)
    ...
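The rest of the generation is elided: the three offspring groups are pooled with the parents, evaluated, and pop_size survivors are selected. As a hedged illustration of that elitist pattern only (the helper name `generation_step` and its callable parameters are mine, not the original code):

```python
import numpy as np

def generation_step(pop, offspring_groups, evaluate, select, pop_size):
    # One elitist generation: pool the parents with all offspring groups,
    # evaluate the combined population, then select pop_size survivors.
    combined = np.vstack([pop] + list(offspring_groups))
    fitness = evaluate(combined)
    return select(combined, fitness, pop_size)
```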
#_____________________________________________________________________________
# Pareto front visualization
fitness_values = evaluation(pop)
index = np.arange(pop.shape[0]).astype(int)
pareto_front_index = pareto_front_finding(fitness_values, index)
pop = pop[pareto_front_index, :]
print("_________________")
print("Optimal solutions:")
print(" x1 x2 x3")
print(pop) # show optimal solutions
fitness_values = fitness_values[pareto_front_index]
print("______________")
print("Fitness values:")
print("objective 1 objective 2")
print(" | |")
print(fitness_values)
plt.scatter(fitness_values[:, 0],fitness_values[:, 1])
plt.xlabel('Objective function 1')
plt.ylabel('Objective function 2')
plt.show()
...
Sorry! This is only half of the code.
Notice: it would take you 1 to 3 hours to re-type the Python code yourself; or, for just €3.99 (the cost of a cup of coffee), you can download/copy the whole Python code within 2 minutes. It’s your choice to make.
The original price is €9.99, but today it’s only €3.99 (save €6 today – available for a limited time only).
Download the whole Python code here (Membership Code ID: 020)
No need to build the Python code from scratch, because that is very time-consuming. My idol, Jim Rohn, once said: “Time is more valuable than money. You can get more money, but you cannot get more time.” If you think this code can be used in your research or teaching work, you should download it and then customize/modify/apply it to your work, without any obligation to cite the original source if you don’t want to. However, redistribution (i.e., downloading the code/script here and then making it available on another site on the Internet) is strictly prohibited.
If you have any question or problem, please contact Dr. Panda by email: learnwithpanda2018@gmail.com
Thank you very much and good luck with your research!
Comments
Respected Sir,
Can we apply similar code for NSGA-II (multi-objective)? If yes, could you please tell me which parts need to change?
Hi, yes. To solve new problems, we need to update the objective function and constraints. The rest can be kept the same.
Thank you, Sir, for your kind response.
Respected Sir,
I have two objective functions: the first objective is to maximize, and the second is to minimize.
Which parts of the code do I need to change?
If it can be changed, please tell me which parts.
thank you so much!!
Hi, for that problem, we need to change the "evaluation" and "pareto_front_finding" functions. If you have any difficulty, just let me know; I will make a video about this problem next week. Thanks.
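As a hedged illustration of the usual trick for mixing minimization and maximization: negate the maximized objective inside "evaluation", so the rest of the minimization machinery (including the Pareto dominance test) can stay unchanged. The objective formulas below are placeholders of my own, not the benchmark from this post:

```python
import numpy as np

def evaluation_mixed(pop):
    # Objective 1 is minimized as-is; objective 2 is to be maximized,
    # so we store its negation and keep minimizing both columns.
    fitness = np.zeros((pop.shape[0], 2))
    for i, x in enumerate(pop):
        f1 = np.sum(x ** 2)        # placeholder objective to minimize
        f2 = np.sum(np.cos(x))     # placeholder objective to maximize
        fitness[i, 0] = f1
        fitness[i, 1] = -f2        # negated: maximize f2 == minimize -f2
    return fitness
```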
We must change "pareto_front_finding" and "evaluation" at least, maybe more. I will make a video about solving a problem with one minimization objective and one maximization objective. Thanks for the suggestion!
Thank you so much !
I have already modified the evaluation function and I already have my own fitness values.
But I have a problem finding the Pareto front.
We should update and then test one function at a time.
Do I need to change the "selection" part?
No, I don't think so.
Thank you so much, Sir.
Respected Sir,
I need to apply this to a MRP (material requirements planning) problem. Is it possible?
Hi, yes, it is possible
The code you provided is mainly about numerical optimization. If we want to solve a multi-objective problem related to spatial location, which parts would we need to change?
We need to update the objective function and constraints. The rest can be kept the same. Thanks
May I ask why the code you provided does not have parameters such as objective function, constraints and decision variables?
It has the objective function and decision variables, but it does not have constraints because the problem is unconstrained. The code works exactly as shown in the video.
Respected Sir,
I need to apply this to three objective functions. Can I change it? If possible, which parts of the code do I need to change?
It is possible to customize the code to solve a problem with three objectives, but the explanation is quite long and can't fit here. I will make that video as soon as possible.
Dear Sir,
When would you find time to explain the code with three objective functions? I would be very grateful!
Many thanks for your suggestion!
Dear Sir,
Did you already make a video for multi-objective optimization where one objective is minimized and the second maximized? I was trying to find the video but could not. Please let me know, thanks.
best,
Hafiz
Many thanks for your interesting suggestion! I will make that video
We are still waiting, sir, for you to make a three-objective video where two objectives need to be maximized and one minimized.
Respected Sir ,
How can I join to get the other half of the code?
Thank you
Follow the link at the end of the post
Hello Sir,
How can I join to get the other half of the code?
You can get the full code by clicking the link at the end.
Sir, can you please make a similar multi-objective NSGA-II video with a constrained problem? The video provided is suitable only for unconstrained problems.
Many thanks for your suggestion!