Hello everyone and welcome!
In this post, I’m going to show you my Hybrid Non-Dominated Sorting Genetic Algorithm (Hybrid NSGA-II) in Python and demonstrate how it solves an optimization problem with 2 objectives and 3 variables. It is an improved multi-objective genetic algorithm, built by adding a local search procedure to the famous Non-Dominated Sorting Genetic Algorithm (NSGA-II) to enhance solution quality.
For more information about NSGA-II, please have a look at this famous paper.
Here are the details of the benchmark function used to test the performance of the developed hybrid NSGA-II. If you want to download the Python code of my hybrid NSGA-II, check the link below. If you have any questions, please let me know by leaving a comment.
Let’s see how my Hybrid Non-Dominated Sorting Genetic Algorithm in Python works.
For more videos like this, check my YouTube channel here.
import random as rn
import numpy as np
import matplotlib.pyplot as plt
import math
#_____________________________________________________________________________
def random_population(nv, n, lb, ub):
    # nv = number of variables
    # n = number of random solutions
    # lb = lower bound
    # ub = upper bound
    pop = np.zeros((n, nv))
    for i in range(n):
        pop[i, :] = np.random.uniform(lb, ub)
    return pop
#_____________________________________________________________________________
def crossover(pop, crossover_rate):
    # single-point crossover between two distinct, randomly chosen parents;
    # crossover_rate is the number of offspring to produce
    offspring = np.zeros((crossover_rate, pop.shape[1]))
    for i in range(int(crossover_rate/2)):
        r1 = np.random.randint(0, pop.shape[0])
        r2 = np.random.randint(0, pop.shape[0])
        while r1 == r2:
            r1 = np.random.randint(0, pop.shape[0])
            r2 = np.random.randint(0, pop.shape[0])
        cutting_point = np.random.randint(1, pop.shape[1])
        offspring[2*i, 0:cutting_point] = pop[r1, 0:cutting_point]
        offspring[2*i, cutting_point:] = pop[r2, cutting_point:]
        offspring[2*i+1, 0:cutting_point] = pop[r2, 0:cutting_point]
        offspring[2*i+1, cutting_point:] = pop[r1, cutting_point:]
    return offspring
#_____________________________________________________________________________
def mutation(pop, mutation_rate):
    # exchange the gene at one randomly chosen position between two
    # distinct parents; mutation_rate is the number of offspring to produce
    offspring = np.zeros((mutation_rate, pop.shape[1]))
    for i in range(int(mutation_rate/2)):
        r1 = np.random.randint(0, pop.shape[0])
        r2 = np.random.randint(0, pop.shape[0])
        while r1 == r2:
            r1 = np.random.randint(0, pop.shape[0])
            r2 = np.random.randint(0, pop.shape[0])
        cutting_point = np.random.randint(0, pop.shape[1])
        offspring[2*i] = pop[r1]
        offspring[2*i, cutting_point] = pop[r2, cutting_point]
        offspring[2*i+1] = pop[r2]
        offspring[2*i+1, cutting_point] = pop[r1, cutting_point]
    return offspring
#_____________________________________________________________________________
def local_search(pop, n, step_size):
    # n = number of offspring chromosomes generated from the local search
    offspring = np.zeros((n, pop.shape[1]))
    for i in range(n):
        ...
        offspring[i, :] = chromosome
    return offspring
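The body of local_search is elided above (it is part of the paid download). For readers who just want the idea, here is a minimal sketch of a typical neighbourhood search: perturb randomly chosen parents by at most step_size per variable, then clip back into the bounds. The name local_search_sketch and the extra lb/ub arguments are my illustrative assumptions, not necessarily what the downloadable code does:

```python
import numpy as np

def local_search_sketch(pop, n, step_size, lb, ub):
    # Generate n neighbours: perturb a randomly chosen parent by a
    # uniform step in [-step_size, step_size] per variable, then clip
    # each variable back into [lb, ub].
    offspring = np.zeros((n, pop.shape[1]))
    for i in range(n):
        r1 = np.random.randint(0, pop.shape[0])
        chromosome = pop[r1, :] + np.random.uniform(-step_size, step_size, pop.shape[1])
        offspring[i, :] = np.clip(chromosome, lb, ub)
    return offspring
```

Any other small-step neighbourhood move (e.g. perturbing only one variable at a time) would serve the same purpose of refining solutions near the current population.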
#_____________________________________________________________________________
def evaluation(pop):
    # the Kursawe benchmark function: two objectives, both minimized
    fitness_values = np.zeros((pop.shape[0], 2))  # because of 2 objective functions
    for i, x in enumerate(pop):
        obj1 = 0
        for j in range(2):
            obj1 += -10*math.exp(-0.2*math.sqrt((x[j])**2 + (x[j+1])**2))
        obj2 = 0
        for j in range(3):
            obj2 += (abs(x[j]))**0.8 + 5*math.sin((x[j])**3)
        fitness_values[i, 0] = obj1  # objective 1
        fitness_values[i, 1] = obj2  # objective 2
    return fitness_values
#_____________________________________________________________________________
def crowding_calculation(fitness_values):
    pop_size = len(fitness_values[:, 0])
    fitness_value_number = len(fitness_values[0, :])
    matrix_for_crowding = np.zeros((pop_size, fitness_value_number))
    ...
    for i in range(fitness_value_number):
        crowding_results = np.zeros(pop_size)
        ...
    return crowding_distance
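crowding_calculation is also elided. In standard NSGA-II, the crowding distance of a solution measures how isolated it is along each objective; the sketch below is the textbook version from the NSGA-II paper, not necessarily the author's exact implementation:

```python
import numpy as np

def crowding_distance_sketch(fitness_values):
    # Standard NSGA-II crowding distance: for each objective, sort the
    # solutions, give the two boundary solutions infinite distance, and
    # add each interior solution's normalized neighbour gap; the final
    # distance is the sum over all objectives.
    pop_size, n_obj = fitness_values.shape
    distance = np.zeros(pop_size)
    for m in range(n_obj):
        order = np.argsort(fitness_values[:, m])
        f_sorted = fitness_values[order, m]
        distance[order[0]] = np.inf   # boundary solutions are always preferred
        distance[order[-1]] = np.inf
        f_range = f_sorted[-1] - f_sorted[0]
        if f_range > 0:
            for k in range(1, pop_size - 1):
                distance[order[k]] += (f_sorted[k + 1] - f_sorted[k - 1]) / f_range
    return distance
```

A larger crowding distance means a less crowded region of the objective space, which is why such solutions are kept when a front must be truncated.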
#_____________________________________________________________________________
def remove_using_crowding(fitness_values, number_solutions_needed):
    pop_index = np.arange(fitness_values.shape[0])
    crowding_distance = crowding_calculation(fitness_values)
    selected_pop_index = np.zeros((number_solutions_needed))
    ...
    for i in range(number_solutions_needed):
        pop_size = pop_index.shape[0]
        solution_1 = rn.randint(0, pop_size - 1)
        ...
    return selected_pop_index
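remove_using_crowding is elided as well. A common way to pick a subset of solutions using crowding distance is a crowded tournament: draw two candidates at random and keep the one in the less crowded region. To keep this sketch self-contained, my version takes the precomputed distances as an argument instead of calling crowding_calculation internally, which differs from the signature shown above:

```python
import numpy as np

def remove_using_crowding_sketch(crowding_distance, number_solutions_needed):
    # Crowded tournament reduction: repeatedly draw two random candidates
    # from the remaining pool and keep the one with the larger crowding
    # distance (i.e. the one lying in the less crowded region).
    remaining = list(range(len(crowding_distance)))
    selected = []
    for _ in range(number_solutions_needed):
        a = np.random.randint(len(remaining))
        b = np.random.randint(len(remaining))
        # ties (e.g. two infinite distances) are resolved in favour of a
        winner = a if crowding_distance[remaining[a]] >= crowding_distance[remaining[b]] else b
        selected.append(remaining.pop(winner))
    return np.array(selected)
```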
#_____________________________________________________________________________
def pareto_front_finding(fitness_values, pop_index):
    pop_size = fitness_values.shape[0]
    ...
    for i in range(pop_size):
        for j in range(pop_size):
            ...
    return pop_index[pareto_front]
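pareto_front_finding is elided. The standard dominance check for two minimized objectives looks like this (a sketch of the usual pairwise comparison, not the paid code):

```python
import numpy as np

def pareto_front_finding_sketch(fitness_values, pop_index):
    # A solution belongs to the Pareto front if no other solution
    # dominates it, i.e. is no worse in every objective and strictly
    # better in at least one (both objectives are minimized here).
    pop_size = fitness_values.shape[0]
    pareto_front = np.ones(pop_size, dtype=bool)
    for i in range(pop_size):
        for j in range(pop_size):
            if np.all(fitness_values[j] <= fitness_values[i]) and \
               np.any(fitness_values[j] < fitness_values[i]):
                pareto_front[i] = False
                break
    return pop_index[pareto_front]
```

This double loop is O(n²) in the population size, which is fine for the 150-member population used in this post.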
#_____________________________________________________________________________
def selection(pop, fitness_values, pop_size):
    pop_index_0 = np.arange(pop.shape[0])
    pop_index = np.arange(pop.shape[0])
    pareto_front_index = []
    ...
    selected_pop = pop[pareto_front_index.astype(int)]
    return selected_pop
#_____________________________________________________________________________
# Parameters
nv = 3                   # number of variables
lb = [-5, -5, -5]        # lower bounds
ub = [5, 5, 5]           # upper bounds
pop_size = 150           # population size
rate_crossover = 20      # number of offspring from crossover
rate_mutation = 20       # number of offspring from mutation
rate_local_search = 10   # number of offspring from local search
step_size = 0.1          # local search step size
pop = random_population(nv, pop_size, lb, ub)
#_____________________________________________________________________________
# Main loop of NSGA-II
for i in range(200):
    offspring_from_crossover = crossover(pop, rate_crossover)
    offspring_from_mutation = mutation(pop, rate_mutation)
    ...
    fitness_values = evaluation(pop)
    pop = selection(pop, fitness_values, pop_size)
    print('iteration:', i)
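The elided line inside the main loop combines the parents with the offspring from crossover, mutation, and local search before evaluation, so that selection chooses survivors from the enlarged population. A plausible version, using stand-in arrays with the shapes set by the parameters in this post:

```python
import numpy as np

# stand-in arrays with the same shapes as in the post
pop = np.random.uniform(-5, 5, (150, 3))                      # parents
offspring_from_crossover = np.random.uniform(-5, 5, (20, 3))
offspring_from_mutation = np.random.uniform(-5, 5, (20, 3))
offspring_from_local_search = np.random.uniform(-5, 5, (10, 3))

# enlarged population handed to evaluation() and selection()
pop = np.vstack((pop,
                 offspring_from_crossover,
                 offspring_from_mutation,
                 offspring_from_local_search))
```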
#_____________________________________________________________________________
# Pareto front visualization
fitness_values = evaluation(pop)
index = np.arange(pop.shape[0]).astype(int)
pareto_front_index = pareto_front_finding(fitness_values, index)
pop = pop[pareto_front_index, :]
print("_________________")
print("Optimal solutions:")
print(" x1 x2 x3")
print(pop) # show optimal solutions
fitness_values = fitness_values[pareto_front_index]
print("______________")
print("Fitness values:")
print(" objective 1 objective 2")
print(" | |")
print(fitness_values)
plt.scatter(fitness_values[:, 0],fitness_values[:, 1])
plt.xlabel('Objective function 1')
plt.ylabel('Objective function 2')
plt.show()
.
.
.
Sorry! This is only half of the Python code.
Notice: it would take you 1 to 3 hours to re-type the Python code yourself, or, for just €3.99 (the cost of a cup of coffee), you can download/copy the whole Python code within 2 minutes. It’s your choice to make.
The original price is €9.99, but today it’s only €3.99 (save €6 today – available for a limited time only).
Download the whole Python code here (Membership Code ID: 021)
No need to build the Python code from scratch, because that is very time-consuming. My idol, Jim Rohn, once said: “Time is more valuable than money. You can get more money, but you cannot get more time.” If you think this code can be used in your research/teaching work, you should download it and then customize/modify/apply it to your work, without any obligation to cite the original source if you don’t want to. However, redistribution (i.e., downloading the code/script here and then making it available on another site on the Internet) is strictly prohibited.
If you have any question or problem, please contact Dr. Panda by email: learnwithpanda2018@gmail.com
Thank you very much and good luck with your research!
Comments
Thank you for sharing the information, Dr. It's very helpful. Happy New Year.
Many thanks Dr. Huali for your support. I've also received your donation of 50 euros. I really appreciate your financial help!!! By the way, your donation has been updated on the donation section of this blog. Happy New Year!
Sir, can we apply this code to different objective functions by changing the upper and lower bounds?
What else do we need to change when we apply it to a different problem?
Hi, yes, it is possible. To solve a new unconstrained problem, we just need to update the objective function and upper/lower bounds. The rest of the code can be kept the same.
I tried to use different objective functions in the code, but I couldn’t get any optimal solutions.
Yeah, for new objective functions we often need some parameter tuning to maximize the performance.
Hi, I am from Iran. How can I buy the code?
You can follow the instruction by clicking on the link at the end of the post.
Hi, I am working with this code for minimizing objective functions. Does this code also work for maximizing objective functions, or do we need to make any modifications?
Hi, we need to convert the maximization problem into a minimization one (for example, by negating the objective).
Hello sir, I am athulp. I already paid the fee for code 021, but there is no option for downloading. Can you help me?
I have sent the code 021 to you by email.
Sir, how can we modify this code for multi-objective problems?
Hello Sir,
I paid for the code but did not get the download link in my email.
Please contact me by email