Hello everyone and welcome!
In this post, I’m going to show you the Python code of my Multi-Objective Hybrid Genetic Algorithm, also called the Hybrid Non-Dominated Sorting Genetic Algorithm (Hybrid NSGA-II). It is a new and improved version of NSGA-II.
For more details about the famous NSGA-II, please have a look at this paper:
We develop the Hybrid NSGA-II by adding a local search algorithm to NSGA-II [1] to improve solution quality.
Here are the details of the benchmark function used to test the performance of the developed Hybrid NSGA-II. If you want to download the Python code of the Hybrid NSGA-II, check the link below. If you have any questions or requests, please let me know by leaving a comment.
Let’s see how my Hybrid NSGA-II works:
For more videos like this, check my YouTube channel here.
import random as rn
import numpy as np
import matplotlib.pyplot as plt
import math
#_____________________________________________________________________________
def random_population(nv, n, lb, ub):
    # nv = number of variables
    # n  = number of random solutions
    # lb = lower bound
    # ub = upper bound
    pop = np.zeros((n, nv))
    for i in range(n):
        pop[i, :] = np.random.uniform(lb, ub)
    return pop
#_____________________________________________________________________________
def crossover(pop, crossover_rate):
    # single-point crossover: crossover_rate is used here as the number of
    # offspring to create (assumed even)
    offspring = np.zeros((crossover_rate, pop.shape[1]))
    for i in range(int(crossover_rate/2)):
        r1 = np.random.randint(0, pop.shape[0])
        r2 = np.random.randint(0, pop.shape[0])
        while r1 == r2:
            r1 = np.random.randint(0, pop.shape[0])
            r2 = np.random.randint(0, pop.shape[0])
        cutting_point = np.random.randint(1, pop.shape[1])
        offspring[2*i, 0:cutting_point] = pop[r1, 0:cutting_point]
        offspring[2*i, cutting_point:] = pop[r2, cutting_point:]
        offspring[2*i+1, 0:cutting_point] = pop[r2, 0:cutting_point]
        offspring[2*i+1, cutting_point:] = pop[r1, cutting_point:]
    return offspring
#_____________________________________________________________________________
def mutation(pop, mutation_rate):
    # gene-exchange mutation: mutation_rate is used here as the number of
    # offspring to create (assumed even); each offspring pair swaps one
    # randomly chosen gene between two randomly chosen parents
    offspring = np.zeros((mutation_rate, pop.shape[1]))
    for i in range(int(mutation_rate/2)):
        r1 = np.random.randint(0, pop.shape[0])
        r2 = np.random.randint(0, pop.shape[0])
        while r1 == r2:
            r1 = np.random.randint(0, pop.shape[0])
            r2 = np.random.randint(0, pop.shape[0])
        cutting_point = np.random.randint(0, pop.shape[1])
        offspring[2*i] = pop[r1]
        offspring[2*i, cutting_point] = pop[r2, cutting_point]
        offspring[2*i+1] = pop[r2]
        offspring[2*i+1, cutting_point] = pop[r1, cutting_point]
    return offspring
#_____________________________________________________________________________
def local_search(pop, n, step_size):
    # n = number of offspring chromosomes generated from the local search
    offspring = np.zeros((n, pop.shape[1]))
    for i in range(n):
        r1 = np.random.randint(0, pop.shape[0])
        chromosome = pop[r1, :]
        ...
        offspring[i, :] = chromosome
    return offspring
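#_____________________________________________________________________________
# The interior of local_search above is elided in this preview. The function
# below is only a rough, hypothetical sketch (not the paid implementation) of
# the kind of hill-climbing move the author mentions in the comments: perturb
# one gene by at most step_size, clip it to the bounds, and keep the move only
# if the new point is no worse in both objectives and strictly better in at
# least one (i.e., it dominates the old point).
def hill_climbing_move_example(chromosome, step_size, lb, ub):
    candidate = chromosome.copy()
    j = np.random.randint(0, candidate.shape[0])            # pick one gene to perturb
    candidate[j] = np.clip(candidate[j] + np.random.uniform(-step_size, step_size),
                           lb[j], ub[j])                     # stay inside the variable bounds
    f_old = evaluation(chromosome.reshape(1, -1))[0]         # objective values before the move
    f_new = evaluation(candidate.reshape(1, -1))[0]          # objective values after the move
    if np.all(f_new <= f_old) and np.any(f_new < f_old):     # candidate dominates the original
        return candidate
    return chromosome                                        # otherwise reject the move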
#_____________________________________________________________________________
def evaluation(pop):
    # the two objectives below form the well-known Fonseca–Fleming benchmark
    # (n = 3, x_i in [-4, 4]); both objectives are minimized
    fitness_values = np.zeros((pop.shape[0], 2))  # because of 2 objective functions
    for i, chromosome in enumerate(pop):
        n = 3  # use the instance with n = 3
        for j in range(2):
            if j == 0:  # objective 1
                fitness_values[i, j] = 1 - math.exp(-sum((chromosome - 1/math.sqrt(n))**2))
            elif j == 1:  # objective 2
                fitness_values[i, j] = 1 - math.exp(-sum((chromosome + 1/math.sqrt(n))**2))
    return fitness_values
#_____________________________________________________________________________
def crowding_calculation(fitness_values):
    pop_size = len(fitness_values[:, 0])
    fitness_value_number = len(fitness_values[0, :])
    matrix_for_crowding = np.zeros((pop_size, fitness_value_number))
    ...
    crowding_distance = np.sum(matrix_for_crowding, axis=1)  # crowding distance of each solution
    return crowding_distance
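#_____________________________________________________________________________
# The interior of crowding_calculation above is elided in this preview. For
# reference only, the function below is a textbook NSGA-II crowding-distance
# computation; the paid version may differ in details such as normalization
# and how boundary solutions are handled.
def crowding_distance_example(fitness_values):
    pop_size, n_obj = fitness_values.shape
    distance = np.zeros(pop_size)
    for m in range(n_obj):
        order = np.argsort(fitness_values[:, m])             # sort solutions by objective m
        f_sorted = fitness_values[order, m]
        distance[order[0]] = np.inf                          # boundary solutions are always kept
        distance[order[-1]] = np.inf
        f_range = f_sorted[-1] - f_sorted[0]
        if f_range == 0:
            continue                                         # objective is constant; nothing to add
        for k in range(1, pop_size - 1):
            distance[order[k]] += (f_sorted[k + 1] - f_sorted[k - 1]) / f_range
    return distance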
#_____________________________________________________________________________
def remove_using_crowding(fitness_values, number_solutions_needed):
    pop_index = np.arange(fitness_values.shape[0])
    crowding_distance = crowding_calculation(fitness_values)
    selected_pop_index = np.zeros((number_solutions_needed))
    selected_fitness_values = np.zeros((number_solutions_needed, len(fitness_values[0, :])))
    for i in range(number_solutions_needed):
        pop_size = pop_index.shape[0]
        solution_1 = rn.randint(0, pop_size - 1)
        ...
    selected_pop_index = np.asarray(selected_pop_index, dtype=int)  # convert the data to integer
    return selected_pop_index
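#_____________________________________________________________________________
# The rest of remove_using_crowding above is elided in this preview. As a
# hedged sketch only, a common way to finish this step is a crowding-distance
# tournament: repeatedly compare two random candidates and keep the one lying
# in the less crowded region until enough solutions have been selected.
def crowding_tournament_example(fitness_values, number_solutions_needed):
    pop_index = np.arange(fitness_values.shape[0])
    crowding_distance = crowding_distance_example(fitness_values)
    selected = []
    for _ in range(number_solutions_needed):
        a, b = np.random.choice(len(pop_index), 2, replace=False)
        winner = a if crowding_distance[a] >= crowding_distance[b] else b
        selected.append(pop_index[winner])                   # keep the less crowded solution
        pop_index = np.delete(pop_index, winner)             # remove it from the candidate pool
        crowding_distance = np.delete(crowding_distance, winner)
    return np.asarray(selected, dtype=int)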
#_____________________________________________________________________________
def pareto_front_finding(fitness_values, pop_index):
    pop_size = fitness_values.shape[0]
    pareto_front = np.ones(pop_size, dtype=bool)  # initially assume all solutions are in the Pareto front ("1")
    for i in range(pop_size):
        for j in range(pop_size):
            ...
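#_____________________________________________________________________________
# The dominance test inside pareto_front_finding above is elided in this
# preview. The function below is a standard (minimization) Pareto-front check,
# shown only as a hedged sketch of how such a routine usually ends.
def pareto_front_example(fitness_values, pop_index):
    pop_size = fitness_values.shape[0]
    pareto_front = np.ones(pop_size, dtype=bool)
    for i in range(pop_size):
        for j in range(pop_size):
            # solution j dominates solution i if it is no worse in every
            # objective and strictly better in at least one objective
            if (np.all(fitness_values[j] <= fitness_values[i]) and
                    np.any(fitness_values[j] < fitness_values[i])):
                pareto_front[i] = False
                break
    return pop_index[pareto_front]                           # indices of the non-dominated solutions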
#_____________________________________________________________________________
def selection(pop, fitness_values, pop_size):
    pop_index_0 = np.arange(pop.shape[0])
    pop_index = np.arange(pop.shape[0])
    pareto_front_index = []
    while len(pareto_front_index) < pop_size:
        ...
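#_____________________________________________________________________________
# The body of the selection loop above is elided in this preview. The function
# below is a hedged outline of the usual NSGA-II survivor selection, built from
# the example routines sketched earlier (not necessarily the author's exact
# code): peel off Pareto fronts one by one and, when the last front does not
# fit entirely, keep only its least crowded members.
def selection_example(pop, fitness_values, pop_size):
    pop_index = np.arange(pop.shape[0])
    selected = []
    while len(selected) < pop_size:
        front = pareto_front_example(fitness_values[pop_index], pop_index)    # current non-dominated front
        if len(selected) + len(front) <= pop_size:
            selected.extend(front)                                            # the whole front fits
        else:
            need = pop_size - len(selected)
            keep = crowding_tournament_example(fitness_values[front], need)   # trim the front by crowding
            selected.extend(front[keep])
        pop_index = np.setdiff1d(pop_index, front)                            # drop the processed front
    selected = np.asarray(selected, dtype=int)
    return pop[selected], fitness_values[selected]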
#_____________________________________________________________________________
# Parameters
nv = 3                    # number of decision variables
lb = [-4, -4, -4]         # lower bounds of the decision variables
ub = [4, 4, 4]            # upper bounds of the decision variables
pop_size = 100            # population size
rate_crossover = 30       # number of offspring generated by crossover
rate_mutation = 20        # number of offspring generated by mutation
rate_local_search = 30    # number of offspring generated by local search
step_size = 0.1           # step size of the local search
pop = random_population(nv, pop_size, lb, ub)
#_____________________________________________________________________________
# Main loop of the hybrid NSGA-II
for i in range(150):
    offspring_from_crossover = crossover(pop, rate_crossover)
    offspring_from_mutation = mutation(pop, rate_mutation)
    offspring_from_local_search = local_search(pop, rate_local_search, step_size)
    ...
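    # The remainder of the loop body is elided in this preview. As a hedged
    # outline only (not necessarily the author's exact code), a typical hybrid
    # NSGA-II generation would append the three offspring sets to the parent
    # population, evaluate everything, and keep the best pop_size survivors:
    #   pop = np.vstack((pop, offspring_from_crossover,
    #                    offspring_from_mutation, offspring_from_local_search))
    #   fitness_values = evaluation(pop)
    #   pop, fitness_values = selection_example(pop, fitness_values, pop_size)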
#_____________________________________________________________________________
# Pareto front visualization
fitness_values = evaluation(pop)
index = np.arange(pop.shape[0]).astype(int)
pareto_front_index = pareto_front_finding(fitness_values, index)
pop = pop[pareto_front_index, :]
print("_________________")
print("Optimal solutions:")
print(" x1 x2 x3")
print(pop) # show optimal solutions
fitness_values = fitness_values[pareto_front_index]
print("______________")
print("Fitness values:")
print("objective 1 objective 2")
print(" | |")
print(fitness_values)
plt.scatter(fitness_values[:, 0],fitness_values[:, 1])
plt.xlabel('Objective function 1')
plt.ylabel('Objective function 2')
plt.show()
.
.
.
Sorry! This is only half of the code.
Notice: it would take you 1 to 3 hours to re-type the Python code yourself; or, for just €3.99 (the cost of a cup of coffee), you can download/copy the whole Python code within 2 minutes. It’s your choice to make.
The original price is €9.99, but today it’s only €3.99 (save €6 today – available for a limited time only).
Download the whole Python code here (Membership Code ID: 020)
No need to build the Python code from scratch, because that is very time-consuming. My idol, Jim Rohn, once said: “Time is more valuable than money. You can get more money, but you cannot get more time.” If you think this code can be used in your research/teaching work, you should download it and then customize/modify/apply it to your work, without any obligation to cite the original source if you don’t want to. However, redistribution (i.e., downloading the code/script here and then making it available on another site on the Internet) is strictly prohibited.
If you have any question or problem, please contact Dr. Panda by email: learnwithpanda2018@gmail.com
Thank you very much and good luck with your research!
Respected Sir,
Can we apply a similar code for NSGA-II (multi-objective)? If yes, could you please tell me which points need to change?
Hi, yes. To solve new problems, we need to update the objective function and constraints. The rest can be kept the same.
Thank you, Sir, for your kind response.
Respected Sir,
I have two objective functions: the first objective is to maximize and the second is to minimize.
Which part of the code do I need to change?
If it can be changed, please tell me which points need to change.
Thank you so much!!
Hi, for that problem, we need to change “evaluation” and “pareto_front_finding”. If you have any difficulty, just let me know; I will make a video about this problem next week. Thanks
Respected Sir, I have two objective functions: the first objective is to maximize and the second is to minimize. Which part of the code do I need to change? If it can be changed, please tell me which points need to change. Thank you so much!!
We must change “pareto_front_finding” and “evaluation” at least, maybe more. I will make a video about solving a problem with one min objective and one max objective. Thanks for the suggestion!
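In the meantime, here is a rough, hypothetical sketch (not taken from the paid code) of how “evaluation” could be rewritten when one objective is maximized: since the rest of the algorithm assumes both objectives are minimized, the maximized objective is simply negated.

import numpy as np

def evaluation_min_max_example(pop):
    # hypothetical placeholder objectives, for illustration only:
    # objective 1 is minimized as usual; objective 2 should be maximized,
    # so we store -f2 and let the minimization-based NSGA-II handle it
    fitness_values = np.zeros((pop.shape[0], 2))
    for i, chromosome in enumerate(pop):
        fitness_values[i, 0] = np.sum(chromosome**2)     # minimize f1 (placeholder objective)
        fitness_values[i, 1] = -np.sum(chromosome)       # maximize f2 by minimizing -f2 (placeholder objective)
    return fitness_values

Just remember to flip the sign of the second objective back when reporting or plotting the results.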
Thank you so much!
I have already modified the evaluation and I already have my own fitness function.
But I have a problem finding the Pareto front.
We should update and then test one function at a time.
Do I need to change the “selection” part?
No, I don’t think so.
Thank you so much, Sir.
Respected Sir,
I need to apply this to an MRP (material requirements planning) problem. Is it possible?
Hi, yes, it is possible
The code you provided is mainly about numerical optimization. If we want to solve a multi-objective problem related to spatial location, which parts would we need to change?
We need to update the objective function and constraints. The rest can be kept the same. Thanks
May I ask why the code you provided does not have parameters such as objective function, constraints and decision variables?
It has the objective function and decision variables, but it does not have constraints because the problem is an unconstrained problem. The code works exactly as shown in the video.
Respected Sir,
I need to apply this to three objective functions. Can I change it? If possible, which parts of the code do I need to change?
It is possible to customize the code to solve a problem with three objectives, but it is quite long and can’t be explained here. I will make that video as soon as possible.
Dear Sir,
When would you find time to explain the code with three objective functions? I would be very grateful!
Many thanks for your suggestion!
Dear Sir,
Did you already make a video for multi-objective optimization with one objective to minimize and the second to maximize? I was trying to find the video but did not find it. Please let me know, thanks.
best,
Hafiz
Many thanks for your interesting suggestion! I will make that video
We are still waiting, Sir, for you to make a three-objective video where two objectives need to be maximized and one minimized.
Respected Sir,
How can I join to get the rest of the code?
Thank you
Follow the link at the end of the post
Hello Sir,
How can I join to get the rest of the code?
You can get the full code by clicking the link at the end.
Sir, can you please make a similar multi-objective NSGA-II video with a constrained problem? The video provided is suitable only for unconstrained problems.
Many thanks for your suggestion!
Hi, I need help with an algorithm in Python, including NSGA-II and indicator-based estimators. I am interested in your help, and I could pay for it.
Many thanks for your interest !
Did you get in touch with the owner of the code for help? I would like to know in order to get in touch.
Hello, I am the owner. How can I help?
Hi Son,
Glad to see you here, and congratulations on your professional work.
Could this code be applicable to binary variables?
Thanks
Many thanks Mohadese!
Sir, does this code work for 8 input parameters and one output parameter?
2 output parameters (i.e., 2 objective functions)
If it works for 9 parameters, in which part of the code do we have to make changes?
We must change the objective function, chromosome, etc.
Can we use it for prediction?
Yes, it can be used
How can we use it for the Traveling Salesman Problem?
Please have a look at my codes for the TSP on this website.
I would like to initialize a list of s1, s2, s3, s4 with some elements .. for a genetic algorithm for the TSP.
Please have a look at my GA codes for the TSP. This code is only for multi-objective optimization.
I need a suggestion: how to combine random solutions with crossover for a genetic algorithm for the TSP.
To handle the constraint in the TSP, we need to design a special crossover for the TSP. For more details, please have a look at my GA code for the TSP on this website.
I want some help in understanding how to write an objective function when my mathematical model is ready. Is there something I could refer to?
We should do the problem formulation first, i.e., write the math functions of the objectives and constraints.
I would like help in understanding how to go about writing the objective function I have already drafted. Is there a possibility to get in touch with you?
Hello, objective functions are the math functions that we want to max or min
Hello… If I have a dataset that has 6 input variables and 2 outputs (for dual objectives), how do I configure this code? Also, if my objective function is a predictive analytics model (say XGBoost1 and XGBoost2), how do I refer to it in the code? Can you please help? Many thanks in advance.
I think regression is needed to solve your problem
Hello Sir,
How do we modify the code to solve a single-objective, multi-variable problem?
Please have a look at my single-objective GA.
Is there any other payment method, such as by card, except PayPal?
A credit card can be used.
I need a suggestion, please: I want to apply this algorithm to perform image segmentation, where the output is the best order for the dilation and erosion operations. I really need advice.
Yes, it could be used for that
I want to apply the multi-objective genetic algorithm for missing data imputation. How can I do that? I really need some help here.
Thank you for your comment.
Dear sir,
Is it possible to have multiple input variables (more than 10) and multiple objective functions (more than 3)?
Yes, it is.
Dear Sir,
Is it possible to have multi-input variables (more than 10) and multi-objective functions (more than 3)?
Yes, it is possible
How is it different from ‘Hybrid Non-Dominated Sorting Genetic Algorithm (Hybrid NSGA-II) in Python’ (December 13, 2020)?
This is my modified version
Hi, have you modified the code for min and max objectives and constraints? I couldn’t find your videos for the problem with two objectives (one with min and the other with max) and constraints. Thank you
Thanks for the good question. I will try to make a video about that as soon as possible.
Hi, I need this code but I don’t have a PayPal account. Also, PayPal is restricted in my country. Is there any way to buy the code or send you money? Thank you.
Can you pay by USDT? If yes, please let me know.
Yes, I can. Please send me the wallet information.
Hello, what is the name of the technique that you are using for local search?
Hello, I used a hill-climbing algorithm.
I paid for the code but can’t find the download button and when I click the link it just takes me back….
Please check your email.
I also have the same problem.
Please check your email
I have paid but cannot download.
Please contact me by email!
Same issue
Hi! Great work! Does it work for 1 objective function? I have an objective function like sum(model – fact)^2, with many parameters in the model that I want to optimize to fit the data.
No, for single-objective problems, please see other posts on this website.
Respected Sir, I have two objective functions: the first is to maximize and the second is to minimize.
Can you make a video for such problems?
Thank you for your suggestions!