Solving Optimization Problems

Python Code of Multi-Objective Hybrid Genetic Algorithm (Hybrid NSGA-II)

Hello everyone and welcome!

In this post, I’m going to show you the Python code of my Multi-Objective Hybrid Genetic Algorithm, also called the Hybrid Non-Dominated Sorting Genetic Algorithm (Hybrid NSGA-II). It is a new and improved version of NSGA-II.

For more details about the famous NSGA-II, please have a look at this paper:

We develop Hybrid NSGA-II by adding a local search algorithm to NSGA-II [1] to improve the solution quality.

Here are the details of the benchmark function used to test the performance of the developed Hybrid NSGA-II. If you want to download the Python code of Hybrid NSGA-II, check the link below. If you have any questions or requests, please let me know by leaving a comment.

Let’s see how my Hybrid NSGA-II works:

For more videos like this, check my YouTube channel here.

import random as rn
import numpy as np
import matplotlib.pyplot as plt
import math

#_____________________________________________________________________________
def random_population(nv,n,lb,ub):
    # nv = number of variables
    # n = number of random solutions
    # lb = lower bound
    # ub = upper bound
    pop=np.zeros((n, nv)) 
    for i in range(n):
        pop[i,:] = np.random.uniform(lb,ub)

    return pop
#_____________________________________________________________________________
def crossover(pop, crossover_rate):
    # one-point crossover: each pair of distinct parents produces two offspring
    offspring = np.zeros((crossover_rate, pop.shape[1]))
    for i in range(int(crossover_rate/2)):
        # pick two distinct parents at random
        r1 = np.random.randint(0, pop.shape[0])
        r2 = np.random.randint(0, pop.shape[0])
        while r1 == r2:
            r1 = np.random.randint(0, pop.shape[0])
            r2 = np.random.randint(0, pop.shape[0])
        # swap the tails of the two parents at a random cutting point
        cutting_point = np.random.randint(1, pop.shape[1])
        offspring[2*i, 0:cutting_point] = pop[r1, 0:cutting_point]
        offspring[2*i, cutting_point:] = pop[r2, cutting_point:]
        offspring[2*i+1, 0:cutting_point] = pop[r2, 0:cutting_point]
        offspring[2*i+1, cutting_point:] = pop[r1, cutting_point:]

    return offspring
#_____________________________________________________________________________
def mutation(pop, mutation_rate):
    # exchange mutation: two distinct parents swap the value of one
    # randomly chosen gene, producing two offspring
    offspring = np.zeros((mutation_rate, pop.shape[1]))
    for i in range(int(mutation_rate/2)):
        # pick two distinct parents at random
        r1 = np.random.randint(0, pop.shape[0])
        r2 = np.random.randint(0, pop.shape[0])
        while r1 == r2:
            r1 = np.random.randint(0, pop.shape[0])
            r2 = np.random.randint(0, pop.shape[0])
        # gene position at which the two parents exchange values
        cutting_point = np.random.randint(0, pop.shape[1])
        offspring[2*i] = pop[r1]
        offspring[2*i, cutting_point] = pop[r2, cutting_point]
        offspring[2*i+1] = pop[r2]
        offspring[2*i+1, cutting_point] = pop[r1, cutting_point]

    return offspring
#_____________________________________________________________________________
def local_search(pop, n, step_size):
    # number of offspring chromosomes generated from the local search
    offspring = np.zeros((n, pop.shape[1]))
    for i in range(n):
        r1=np.random.randint(0,pop.shape[0])
        chromosome = pop[r1,:]
        ...  
      
        offspring[i,:] = chromosome
    return offspring
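The body of `local_search` is elided in the post. As a hedged sketch (my assumption, not the author's exact method), a simple neighbourhood move perturbs a copy of a random parent by at most `step_size` per gene and clips the result back into the bounds:

```python
import numpy as np

def local_search_sketch(pop, n, step_size, lb, ub):
    # n offspring, each a perturbed copy of a randomly chosen parent
    offspring = np.zeros((n, pop.shape[1]))
    for i in range(n):
        r1 = np.random.randint(0, pop.shape[0])
        chromosome = pop[r1, :].copy()
        # move every gene by a uniform step in [-step_size, step_size]
        chromosome += np.random.uniform(-step_size, step_size, pop.shape[1])
        # clip back into the box constraints [lb, ub]
        offspring[i, :] = np.clip(chromosome, lb, ub)
    return offspring
```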
#_____________________________________________________________________________
def evaluation(pop):
    fitness_values = np.zeros((pop.shape[0], 2)) # because of 2 objective functions
    for i, chromosome in enumerate(pop):
        n = 3 # use the instance with n = 3
        for j in range(2):
            if j == 0:      # objective 1 
                fitness_values[i,j] = 1 - math.exp(-sum((chromosome - 1/math.sqrt(n))**2))

            elif j == 1:     # objective 2 
                fitness_values[i,j] = 1 - math.exp(-sum((chromosome + 1/math.sqrt(n))**2))
        
    return fitness_values
#_____________________________________________________________________________
def crowding_calculation(fitness_values):
    
    pop_size = len(fitness_values[:, 0])
    fitness_value_number = len(fitness_values[0, :])
    matrix_for_crowding = np.zeros((pop_size, fitness_value_number))
    ...

    crowding_distance = np.sum(matrix_for_crowding, axis=1) # crowding distance of each solution

    return crowding_distance
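The middle of `crowding_calculation` is elided. In standard NSGA-II the crowding distance is computed per objective after normalization: sort the solutions along each objective, give the two boundary solutions the maximum distance, and give each interior solution the gap between its two neighbours. A sketch of that standard recipe (assumed, since the original body is not shown):

```python
import numpy as np

def crowding_calculation_sketch(fitness_values):
    pop_size, n_obj = fitness_values.shape
    matrix_for_crowding = np.zeros((pop_size, n_obj))
    # normalize each objective to [0, 1] so distances are comparable
    span = fitness_values.max(axis=0) - fitness_values.min(axis=0)
    normalized = (fitness_values - fitness_values.min(axis=0)) / np.where(span > 0, span, 1)
    for j in range(n_obj):
        order = np.argsort(normalized[:, j])
        col = np.zeros(pop_size)
        # boundary solutions get the maximum normalized distance
        col[order[0]] = 1.0
        col[order[-1]] = 1.0
        # interior solutions get the gap between their two neighbours
        col[order[1:-1]] = normalized[order[2:], j] - normalized[order[:-2], j]
        matrix_for_crowding[:, j] = col
    return np.sum(matrix_for_crowding, axis=1)  # crowding distance of each solution
```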
#_____________________________________________________________________________
def remove_using_crowding(fitness_values, number_solutions_needed):
  
    pop_index = np.arange(fitness_values.shape[0])
    crowding_distance = crowding_calculation(fitness_values)
    selected_pop_index = np.zeros((number_solutions_needed))
    selected_fitness_values = np.zeros((number_solutions_needed, len(fitness_values[0, :])))
    
    for i in range(number_solutions_needed):
        pop_size = pop_index.shape[0]
        solution_1 = rn.randint(0, pop_size - 1)
        ...
    selected_pop_index = np.asarray(selected_pop_index, dtype=int) # Convert the data to integer 
    
    return (selected_pop_index)
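The selection step inside `remove_using_crowding` is also elided. One common way to finish it, shown here as an assumption rather than the author's exact code, is a binary tournament on crowding distance: pick two random candidates, keep the one in the less crowded region, and delete the winner from the pool so it cannot be selected twice. The sketch takes the crowding distances as an argument so it stands alone:

```python
import random as rn
import numpy as np

def remove_using_crowding_sketch(fitness_values, crowding_distance, number_solutions_needed):
    pop_index = np.arange(fitness_values.shape[0])
    cd = np.asarray(crowding_distance, dtype=float)
    selected_pop_index = np.zeros(number_solutions_needed, dtype=int)
    for i in range(number_solutions_needed):
        pop_size = pop_index.shape[0]
        # binary tournament: two random candidates compete on crowding distance
        solution_1 = rn.randint(0, pop_size - 1)
        solution_2 = rn.randint(0, pop_size - 1)
        winner = solution_1 if cd[solution_1] >= cd[solution_2] else solution_2
        selected_pop_index[i] = pop_index[winner]
        # remove the winner from the pool so it is not selected twice
        pop_index = np.delete(pop_index, winner)
        cd = np.delete(cd, winner)
    return selected_pop_index
```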
#_____________________________________________________________________________
def pareto_front_finding(fitness_values, pop_index):

    pop_size = fitness_values.shape[0]
    pareto_front = np.ones(pop_size, dtype=bool) # initially assume all solutions are in pareto front by using "1"
    for i in range(pop_size):
        for j in range(pop_size):
                ...
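The dominance test inside the double loop is cut off. For two objectives that are both minimized, solution j dominates solution i when j is no worse in every objective and strictly better in at least one. A self-contained sketch of the whole routine under that standard definition (assumed, since the original return is not shown):

```python
import numpy as np

def pareto_front_finding_sketch(fitness_values, pop_index):
    pop_size = fitness_values.shape[0]
    pareto_front = np.ones(pop_size, dtype=bool)  # assume all on the front
    for i in range(pop_size):
        for j in range(pop_size):
            # j dominates i: no worse everywhere, strictly better somewhere
            if np.all(fitness_values[j] <= fitness_values[i]) and \
               np.any(fitness_values[j] < fitness_values[i]):
                pareto_front[i] = False  # i is dominated, drop it
                break
    return pop_index[pareto_front]
```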
#_____________________________________________________________________________
def selection(pop, fitness_values, pop_size):
    
    pop_index_0 = np.arange(pop.shape[0])
    pop_index = np.arange(pop.shape[0])
    pareto_front_index = []
    
    while len(pareto_front_index) < pop_size:
        ...
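`selection` is cut off as well. In NSGA-II the survivors are collected front by front; when the last front overshoots `pop_size`, it is trimmed by crowding distance. The sketch below follows that scheme but trims the last front by simple truncation to stay short (an assumption, clearly simpler than the crowding-based trim in the full code):

```python
import numpy as np

def selection_sketch(pop, fitness_values, pop_size):
    remaining = np.arange(pop.shape[0])
    survivors = []
    while len(survivors) < pop_size:
        fv = fitness_values[remaining]
        front_mask = np.ones(len(remaining), dtype=bool)
        # mark the nondominated solutions among the remaining candidates
        for i in range(len(remaining)):
            for j in range(len(remaining)):
                if np.all(fv[j] <= fv[i]) and np.any(fv[j] < fv[i]):
                    front_mask[i] = False
                    break
        front = remaining[front_mask]
        # take the whole front, or truncate the last one to fit pop_size
        need = pop_size - len(survivors)
        survivors.extend(front[:need].tolist())
        remaining = remaining[~front_mask]
    return pop[np.array(survivors), :]
```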
#_____________________________________________________________________________
# Parameters
nv = 3
lb = [-4, -4, -4]
ub = [4, 4, 4]
pop_size = 100
rate_crossover = 30
rate_mutation = 20
rate_local_search = 30
step_size = 0.1
pop = random_population(nv,pop_size,lb,ub)
#_____________________________________________________________________________
# Main loop of NSGA II

for i in range(150):
    offspring_from_crossover = crossover(pop,rate_crossover)
    offspring_from_mutation = mutation(pop,rate_mutation)
    offspring_from_local_search = local_search(pop, rate_local_search, step_size)
    ...
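The rest of the loop body is elided. What must happen next, in any NSGA-II variant, is pooling the parents with all three offspring groups before survivor selection; the shapes below (100 + 30 + 20 + 30 = 180 candidates) follow from the parameters above. A minimal sketch of just the pooling step, with random stand-ins for the offspring arrays:

```python
import numpy as np

# stand-ins with the shapes produced by the operators above (assumed)
pop = np.random.uniform(-4, 4, (100, 3))
offspring_from_crossover = np.random.uniform(-4, 4, (30, 3))
offspring_from_mutation = np.random.uniform(-4, 4, (20, 3))
offspring_from_local_search = np.random.uniform(-4, 4, (30, 3))

# pool parents and offspring; selection would then evaluate the 180
# candidates and keep the best pop_size = 100 for the next generation
combined = np.vstack((pop, offspring_from_crossover,
                      offspring_from_mutation, offspring_from_local_search))
```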
#_____________________________________________________________________________
# Pareto front visualization
fitness_values = evaluation(pop)
index = np.arange(pop.shape[0]).astype(int)
pareto_front_index = pareto_front_finding(fitness_values, index)
pop = pop[pareto_front_index, :]
print("_________________")
print("Optimal solutions:")
print("     x1           x2           x3")
print(pop) # show optimal solutions
fitness_values = fitness_values[pareto_front_index]
print("______________")
print("Fitness values:")
print("objective 1  objective 2")
print("      |          |")
print(fitness_values)
plt.scatter(fitness_values[:, 0],fitness_values[:, 1])
plt.xlabel('Objective function 1')
plt.ylabel('Objective function 2')
plt.show()
...
Sorry! This is only half of the code.

Notice: It would take you 1 to 3 hours to re-type this Python code yourself; or, for just €3.99 (the cost of a cup of coffee), you can download the whole Python code within 2 minutes. The choice is yours.

The original price is €9.99, but today it’s only €3.99 (save €6 today – available for a limited time only).

Download the whole Python code here (Membership Code ID: 020)

No need to build the Python code from scratch, because that is very time-consuming. My idol, Jim Rohn, once said: “Time is more valuable than money. You can get more money, but you cannot get more time.” If you think this code can be used in your research or teaching work, you should download it and then customize, modify, and apply it to your work, without any obligation to cite the original source if you don’t want to. However, redistribution (i.e., downloading the code/script here and then making it available on another site on the Internet) is strictly prohibited.

If you have any question or problem, please contact Dr. Panda by email: learnwithpanda2018@gmail.com

Thank you very much and good luck with your research!

Dr. Panda

Comments

  • Respected Sir,
    Can we apply similar code for NSGA-II (multi-objective)? If yes, could you please tell me which parts need to change?

  • Respected Sir,
    I have two objective functions: the first objective is to be maximized and the second to be minimized.
    Which part of the code do I need to change?
    If it can be changed, please tell me which points need to change.
    Thank you so much!

    • Hi, for that problem, we need to change the "evaluation" and "pareto_front_finding" functions. If you have any difficulty, just let me know; I will make a video about this problem next week. Thanks


  • Respected Sir,
    I need to apply this to an MRP (material requirements planning) problem. Is it possible?

  • The code you provided is mainly about numerical optimization. If we want to solve a multi-objective problem related to spatial location, which parts would we need to change?

    • We need to update the objective function and constraints. The rest can be kept the same. Thanks

  • May I ask why the code you provided does not have parameters such as objective function, constraints and decision variables?

    • It has the objective function and decision variables, but it does not have constraints because the problem is unconstrained. The code works exactly as shown in the video.

  • Respected Sir,
    I need to apply this to three objective functions. Can I change it? If possible, which part of the code do I need to change?

    • It is possible to customize the code to solve a problem with three objectives, but the explanation is quite long and can't fit here. I will make that video as soon as possible.

      • Dear Sir,
        When would you find time to explain the code with three objective functions? I would be very grateful!

      • Dear Sir,
        Did you already make a video for multi-objective optimization one to minimize and the second to maximize the function? i was trying to find a video but I did not find it. please let me know thanks.
        best,
        Hafiz

      • We are still waiting sir for you to make a 3 objective video where 2 objectives need to be maximized and one minimized.

  • Sir, can you please make a similar multi-objective NSGA-II video with a constrained problem? The video provided is suitable only for unconstrained problems.
