Python Code of Multi-Objective Hybrid Genetic Algorithm (Hybrid NSGA-II)

Hello everyone and welcome!

In this post, I’m going to show you the Python code of my Multi-Objective Hybrid Genetic Algorithm, also called the Hybrid Non-Dominated Sorting Genetic Algorithm (Hybrid NSGA-II). It is a new and improved version of NSGA-II.

For more details about the famous NSGA-II, please have a look at this paper: K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan, “A fast and elitist multiobjective genetic algorithm: NSGA-II,” IEEE Transactions on Evolutionary Computation, vol. 6, no. 2, pp. 182–197, 2002.

We develop the Hybrid NSGA-II by adding a local search algorithm to NSGA-II [1] to improve solution quality.

The benchmark function used to test the performance of the developed Hybrid NSGA-II is given in the code below. If you want to download the Python code of the Hybrid NSGA-II, check the link below. If you have any questions or requests, please let me know by leaving a comment.

Let’s see how my Hybrid NSGA-II works:

For more videos like this, check my YouTube channel here.

import random as rn
import numpy as np
import matplotlib.pyplot as plt
import math

#_____________________________________________________________________________
def random_population(nv,n,lb,ub):
    # nv = number of variables
    # n = number of random solutions
    # lb = lower bound
    # ub = upper bound
    pop=np.zeros((n, nv)) 
    for i in range(n):
        pop[i,:] = np.random.uniform(lb,ub)

    return pop
#_____________________________________________________________________________
def crossover(pop, crossover_rate):
    # crossover_rate = number of offspring to generate (must be an even number)
    offspring = np.zeros((crossover_rate, pop.shape[1]))
    for i in range(int(crossover_rate/2)):
        r1=np.random.randint(0, pop.shape[0])
        r2 = np.random.randint(0, pop.shape[0])
        while r1 == r2:
            r1 = np.random.randint(0, pop.shape[0])
            r2 = np.random.randint(0, pop.shape[0])
        cutting_point = np.random.randint(1, pop.shape[1])
        offspring[2*i, 0:cutting_point] = pop[r1, 0:cutting_point]
        offspring[2*i, cutting_point:] = pop[r2, cutting_point:]
        offspring[2*i+1, 0:cutting_point] = pop[r2, 0:cutting_point]
        offspring[2*i+1, cutting_point:] = pop[r1, cutting_point:]

    return offspring
#_____________________________________________________________________________
def mutation(pop, mutation_rate):
    # mutation_rate = number of offspring to generate (must be an even number)
    # here, mutation swaps one randomly chosen gene between two parent chromosomes
    offspring = np.zeros((mutation_rate, pop.shape[1]))
    for i in range(int(mutation_rate/2)):
        r1=np.random.randint(0, pop.shape[0])
        r2 = np.random.randint(0, pop.shape[0])
        while r1 == r2:
            r1 = np.random.randint(0, pop.shape[0])
            r2 = np.random.randint(0, pop.shape[0])
        cutting_point = np.random.randint(0, pop.shape[1])
        offspring[2*i] = pop[r1]
        offspring[2*i,cutting_point] = pop[r2,cutting_point]
        offspring[2*i+1] = pop[r2]
        offspring[2*i+1, cutting_point] = pop[r1, cutting_point]

    return offspring
#_____________________________________________________________________________
def local_search(pop, n, step_size):
    # n = number of offspring chromosomes generated by the local search
    offspring = np.zeros((n, pop.shape[1]))
    for i in range(n):
        r1=np.random.randint(0,pop.shape[0])
        chromosome = pop[r1,:]
        ...  
      
        offspring[i,:] = chromosome
    return offspring
#_____________________________________________________________________________
def evaluation(pop):
    fitness_values = np.zeros((pop.shape[0], 2)) # because of 2 objective functions
    for i, chromosome in enumerate(pop):
        n = 3 # use the benchmark instance with n = 3 decision variables
        for j in range(2):
            if j == 0:      # objective 1 
                fitness_values[i,j] = 1 - math.exp(-sum((chromosome - 1/math.sqrt(n))**2))

            elif j == 1:     # objective 2 
                fitness_values[i,j] = 1 - math.exp(-sum((chromosome + 1/math.sqrt(n))**2))
        
    return fitness_values
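As a sanity check of this benchmark (a Fonseca–Fleming-style test function), the point x = (1/√3, 1/√3, 1/√3) makes objective 1 exactly zero, while objective 2 is far from its optimum there, so the two objectives conflict:

```python
import math
import numpy as np

def evaluation(pop):
    # same two-objective benchmark as in the post, with n = 3 variables
    fitness_values = np.zeros((pop.shape[0], 2))
    n = 3
    for i, chromosome in enumerate(pop):
        fitness_values[i, 0] = 1 - math.exp(-sum((chromosome - 1/math.sqrt(n))**2))
        fitness_values[i, 1] = 1 - math.exp(-sum((chromosome + 1/math.sqrt(n))**2))
    return fitness_values

x = np.full((1, 3), 1/math.sqrt(3))  # optimum of objective 1
f = evaluation(x)
print(f)  # objective 1 is 0 here; objective 2 equals 1 - exp(-4)
```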
#_____________________________________________________________________________
def crowding_calculation(fitness_values):
    
    pop_size = fitness_values.shape[0]
    fitness_value_number = fitness_values.shape[1]
    matrix_for_crowding = np.zeros((pop_size, fitness_value_number))
    ...

    crowding_distance = np.sum(matrix_for_crowding, axis=1) # crowding distance of each solution

    return crowding_distance
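The sorting and normalization steps are elided here (they are included in the full download). For reference, the standard NSGA-II crowding distance can be sketched as follows; boundary solutions get an infinite distance so they are always preferred:

```python
import numpy as np

def crowding_calculation(fitness_values):
    # standard NSGA-II crowding distance, sketched per objective
    pop_size, n_objectives = fitness_values.shape
    matrix_for_crowding = np.zeros((pop_size, n_objectives))
    for j in range(n_objectives):
        order = np.argsort(fitness_values[:, j])  # sort solutions by objective j
        f_sorted = fitness_values[order, j]
        f_range = f_sorted[-1] - f_sorted[0]
        distance = np.zeros(pop_size)
        distance[0] = np.inf   # boundary solutions are always kept
        distance[-1] = np.inf
        if f_range > 0:
            # interior solutions: normalized gap between their two neighbours
            distance[1:-1] = (f_sorted[2:] - f_sorted[:-2]) / f_range
        matrix_for_crowding[order, j] = distance
    crowding_distance = np.sum(matrix_for_crowding, axis=1)
    return crowding_distance

f = np.array([[0.0, 1.0], [0.5, 0.5], [1.0, 0.0]])
d = crowding_calculation(f)
print(d)  # [inf, 2.0, inf] -- the middle point gets a finite distance
```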
#_____________________________________________________________________________
def remove_using_crowding(fitness_values, number_solutions_needed):
  
    pop_index = np.arange(fitness_values.shape[0])
    crowding_distance = crowding_calculation(fitness_values)
    selected_pop_index = np.zeros(number_solutions_needed)
    selected_fitness_values = np.zeros((number_solutions_needed, len(fitness_values[0, :])))
    
    for i in range(number_solutions_needed):
        pop_size = pop_index.shape[0]
        solution_1 = rn.randint(0, pop_size - 1)
        ...
    selected_pop_index = np.asarray(selected_pop_index, dtype=int) # Convert the data to integer 
    
    return selected_pop_index
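The tournament in the elided part can be sketched as follows: two candidates are drawn at random and the one with the larger crowding distance is kept, then removed from the pool. This sketch passes the crowding distances in as an argument instead of recomputing them inside, so it is an illustration of the idea rather than the exact implementation above:

```python
import random as rn
import numpy as np

def remove_using_crowding(fitness_values, number_solutions_needed, crowding_distance):
    # keep `number_solutions_needed` solutions out of an over-full front,
    # preferring solutions with a larger crowding distance
    pop_index = np.arange(fitness_values.shape[0])
    selected_pop_index = np.zeros(number_solutions_needed, dtype=int)
    for i in range(number_solutions_needed):
        pop_size = pop_index.shape[0]
        solution_1 = rn.randint(0, pop_size - 1)
        solution_2 = rn.randint(0, pop_size - 1)
        # binary tournament: the solution with the larger crowding distance wins
        if crowding_distance[solution_1] >= crowding_distance[solution_2]:
            winner = solution_1
        else:
            winner = solution_2
        selected_pop_index[i] = pop_index[winner]
        # remove the winner from the pool so it cannot be selected twice
        pop_index = np.delete(pop_index, winner)
        crowding_distance = np.delete(crowding_distance, winner)
    return selected_pop_index

distances = np.array([np.inf, 2.0, np.inf, 0.5])
fitness = np.zeros((4, 2))
kept = remove_using_crowding(fitness, 2, distances)
print(len(kept))  # 2 distinct indices are kept
```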
#_____________________________________________________________________________
def pareto_front_finding(fitness_values, pop_index):

    pop_size = fitness_values.shape[0]
    pareto_front = np.ones(pop_size, dtype=bool) # initially assume all solutions are in the Pareto front
    for i in range(pop_size):
        for j in range(pop_size):
                ...
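The elided dominance check works as follows (for minimization): solution j dominates solution i if it is no worse in every objective and strictly better in at least one. A self-contained sketch of the idea:

```python
import numpy as np

def pareto_front_finding(fitness_values, pop_index):
    pop_size = fitness_values.shape[0]
    pareto_front = np.ones(pop_size, dtype=bool)  # assume every solution is non-dominated
    for i in range(pop_size):
        for j in range(pop_size):
            # solution j dominates solution i (minimization) if it is no worse
            # in every objective and strictly better in at least one
            if np.all(fitness_values[j] <= fitness_values[i]) and \
               np.any(fitness_values[j] < fitness_values[i]):
                pareto_front[i] = False  # i is dominated, drop it from the front
                break
    return pop_index[pareto_front]

f = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])  # the third point is dominated
front = pareto_front_finding(f, np.arange(3))
print(front)  # [0 1]
```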
#_____________________________________________________________________________
def selection(pop, fitness_values, pop_size):
    
    pop_index_0 = np.arange(pop.shape[0])
    pop_index = np.arange(pop.shape[0])
    pareto_front_index = []
    
    while len(pareto_front_index) < pop_size:
        ...
#_____________________________________________________________________________
# Parameters
nv = 3                  # number of decision variables
lb = [-4, -4, -4]       # lower bounds
ub = [4, 4, 4]          # upper bounds
pop_size = 100          # population size
rate_crossover = 30     # number of offspring from crossover
rate_mutation = 20      # number of offspring from mutation
rate_local_search = 30  # number of offspring from local search
step_size = 0.1         # local search step size
pop = random_population(nv,pop_size,lb,ub)
#_____________________________________________________________________________
# Main loop of the Hybrid NSGA-II

for i in range(150): # number of generations
    offspring_from_crossover = crossover(pop,rate_crossover)
    offspring_from_mutation = mutation(pop,rate_mutation)
    offspring_from_local_search = local_search(pop, rate_local_search, step_size)
    ...
#_____________________________________________________________________________
# Pareto front visualization
fitness_values = evaluation(pop)
index = np.arange(pop.shape[0]).astype(int)
pareto_front_index = pareto_front_finding(fitness_values, index)
pop = pop[pareto_front_index, :]
print("_________________")
print("Optimal solutions:")
print("     x1           x2           x3")
print(pop) # show optimal solutions
fitness_values = fitness_values[pareto_front_index]
print("______________")
print("Fitness values:")
print("objective 1  objective 2")
print("      |          |")
print(fitness_values)
plt.scatter(fitness_values[:, 0],fitness_values[:, 1])
plt.xlabel('Objective function 1')
plt.ylabel('Objective function 2')
plt.show()
.
.
.
Sorry! This is only half of the code.

Notice: It would take you 1 to 3 hours to re-type the Python code yourself; or, for just €3.99 (the cost of a cup of coffee), you can download/copy the whole Python code within 2 minutes. It’s your choice to make.

The original price is €9.99, but today it’s only €3.99 (save €6 today – available for a limited time only).

Download the whole Python code here (Membership Code ID: 020)

No need to build the Python code from scratch, because that is very time-consuming. My idol, Jim Rohn, once said: “Time is more valuable than money. You can get more money, but you cannot get more time.” If you think this code can be used in your research/teaching work, you should download it and then customize/modify/apply it to your work, without any obligation to cite the original source if you don’t want to. However, redistribution (i.e., downloading the code/script here and then making it available on another site on the Internet) is strictly prohibited.

If you have any question or problem, please contact Dr. Panda by email: learnwithpanda2018@gmail.com

Thank you very much and good luck with your research!

87 Replies to “Python Code of Multi-Objective Hybrid Genetic Algorithm (Hybrid NSGA-II)”

    1. Hi, yes. To solve new problems, we need to update the objective function and constraints. The rest can be kept the same.

  1. Respected Sir,
    I have two objective functions: the first objective is to maximize and the second is to minimize.
    Which part of the code do I need to change?
    If it can be changed, please tell me which points need to change.
    Thank you so much!!

    1. Hi, for that problem, we need to change the “evaluation” and “pareto_front_finding” functions. If you have any difficulty, just let me know; I will make a video about this problem next week. Thanks

  2. Respected Sir, I have two objective functions: the first objective is to maximize and the second is to minimize. Which part of the code do I need to change? If it can be changed, please tell me which points need to change. Thank you so much!!

    1. We must change “pareto_front_finding” and “evaluation” at least, maybe more. I will make a video about solving a problem with one min objective and one max objective. Thanks for the suggestion!
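    For reference, if the code minimizes both objectives (as the version above does), the usual trick is to negate the objective that should be maximized inside “evaluation”, so the rest of the algorithm is untouched. A minimal sketch with two made-up example objectives (not the benchmark from the post):

```python
import numpy as np

def evaluation(pop):
    # hypothetical example objectives, for illustration only:
    # objective 1 is minimized as-is; objective 2 should be maximized,
    # so we minimize its negative instead
    fitness_values = np.zeros((pop.shape[0], 2))
    for i, x in enumerate(pop):
        fitness_values[i, 0] = np.sum(x**2)        # minimize f1
        fitness_values[i, 1] = -np.sum(np.sin(x))  # maximize sum(sin(x)) -> minimize -f2
    return fitness_values

pop = np.array([[1.0, 1.0, 1.0]])
fv = evaluation(pop)
print(fv)  # f1 = 3.0, f2 = -3*sin(1.0) (about -2.52)
```

    Remember to flip the sign back when reporting the maximized objective’s values.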

      1. Thank you so much!
        I already modified the evaluation and I already have my own fitness.
        But I still have a problem finding the Pareto front.

  3. Respected Sir,
    I need to apply this to a MRP (material requirements planning) problem. Is it possible?

  4. The code you provided is mainly about numerical optimization. If we want to solve a multi-objective problem related to spatial location, which parts would we need to change?

    1. We need to update the objective function and constraints. The rest can be kept the same. Thanks

  5. May I ask why the code you provided does not have parameters such as objective function, constraints and decision variables?

    1. It has the objective function and decision variables, but it does not have constraints because the problem is an unconstrained problem. The code works exactly as shown in the video.

  6. Respected Sir,
    I need to apply this to three objective functions. Can I change it? If possible, which part of the code do I need to change?

    1. It is possible to customize the code to solve a problem with three objectives, but it is quite long and can’t be explained here. I will make that video as soon as possible.

      1. Dear Sir,
        When would you find time to explain the code with three objective functions? I would be very grateful!

      2. Dear Sir,
        Did you already make a video for multi-objective optimization with one objective to minimize and the second to maximize? I was trying to find the video but did not find it. Please let me know, thanks.
        best,
        Hafiz

      3. We are still waiting, sir, for you to make a video with 3 objectives where 2 objectives need to be maximized and one minimized.

  7. Sir, can you please make a similar multi-objective NSGA-II video with a constrained problem? The video provided is suitable only for unconstrained problems.

  8. Hi, I need help with an algorithm in Python involving NSGA-II and indicator-based estimators. I am interested in your help, and I could pay for it.

  9. Hi Son,
    Glad to see you here, and congratulations on your professional work.
    Could this code be applicable to binary variables?
    Thanks

  10. I would like to do the initialization of a list of s1, s2, s3, s4 with some elements .. for a genetic algorithm with TSP.

    1. To handle the constraints in TSP, we need to design a special crossover for TSP. For more details, please have a look at my GA code for TSP on this website.

  11. I want some help in understanding how to write an objective function when my mathematical model is ready. Is there something I could refer to?

    1. We should do the problem formulation first, i.e., write the math functions of the objectives and constraints.

  12. I would like help in understanding how to go about writing the objective function I have already drafted. Is there a possibility to get in touch with you?

  13. Hello… If I have a dataset that has 6 input variables and 2 outputs (for dual objectives), how do I configure this code? Also, if my objective function is a predictive analytics model (say XGBoost1 and XGBoost2), how do I refer to it in the code? Can you please help? Many thanks in advance.

  14. I need a suggestion, please: I want to apply this algorithm to perform image segmentation, where the output is the best order of the dilation and erosion operations. I really need advice.

  15. I want to apply a multi-objective genetic algorithm for missing data imputation. How can I do that? I really need some help here.

  16. Dear sir,
    Is it possible to have multiple input variables (more than 10) and multiple objective functions (more than 3)?

  18. How is it different from ‘Hybrid Non-Dominated Sorting Genetic Algorithm (Hybrid NSGA-II) in Python(December 13, 2020)’?

  19. Hi, have you modified the code for min and max objectives and constraints? I couldn’t find your video for the problem with two objectives (one min and the other max) and constraints. Thank you

  20. Hi, I need this code but I don’t have a PayPal account, and PayPal is restricted in my country. Is there any way to buy the code or send you money? Thank you

  21. I paid for the code but can’t find the download button, and when I click the link it just takes me back…

  22. Hi! Great work! Does it work for one objective function? I have an objective function like sum(model − fact)^2, with many parameters in the model that I want to optimize to fit the data.

  23. Respected Sir, I have two objective functions: the first objective is to maximize and the second is to minimize.
    Can you make a video for such problems?

Leave a Reply

Your email address will not be published. Required fields are marked *