Solving Optimization Problems

Multi-Start Genetic Algorithm (Python Code)

Hello everyone! In this video, I’m going to show you my Python code for a multi-start genetic algorithm (multi-start GA). Its performance is demonstrated by solving a famous benchmark global optimization problem, namely the Eggholder function.

The genetic algorithm (GA) is one of the most popular stochastic optimization algorithms, often used to solve complex, large-scale optimization problems in various fields. The multi-start genetic algorithm is an improved version of the traditional genetic algorithm that enhances its search capability for global optimization.

Here are the details of the benchmark global optimization problem to be solved by this multi-start genetic algorithm.
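For reference, the Eggholder function is easy to write down directly. Here is a minimal NumPy sketch (the two-argument signature is my own choice for illustration, not the form used in the GA code below, which works on a population array):

```python
import numpy as np

def eggholder(x, y):
    # Eggholder benchmark function, usually evaluated on [-512, 512]^2.
    return (-(y + 47) * np.sin(np.sqrt(abs(x / 2 + y + 47)))
            - x * np.sin(np.sqrt(abs(x - (y + 47)))))

# Known global minimum: f(512, 404.2319) ≈ -959.6407
print(eggholder(512.0, 404.2319))
```

The many shallow local minima of this function are what make it a good stress test for a global optimizer.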

Here is the general structure of the multi-start genetic algorithm. Note that this genetic algorithm has a local search algorithm integrated into its structure, so you could also call it a multi-start hybrid genetic algorithm.
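As a rough illustration of that multi-start structure (this is a simplified sketch of my own, not the code sold in this post: it minimizes the raw Eggholder value directly, uses truncation selection, blend crossover, and Gaussian mutation, and all parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
LB, UB = -512.0, 512.0  # Eggholder search bounds

def eggholder(p):
    x, y = p
    return (-(y + 47) * np.sin(np.sqrt(abs(x / 2 + y + 47)))
            - x * np.sin(np.sqrt(abs(x - (y + 47)))))

def run_ga(pop_size=30, generations=200):
    # One GA run: keep the best half (elitist truncation selection),
    # refill with blended, mutated children.
    pop = rng.uniform(LB, UB, size=(pop_size, 2))
    for _ in range(generations):
        fit = np.array([eggholder(p) for p in pop])   # minimizing
        parents = pop[np.argsort(fit)[:pop_size // 2]]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.clip((a + b) / 2 + rng.normal(0, 20, 2), LB, UB)
            children.append(child)
        pop = np.vstack([parents, children])
    fit = np.array([eggholder(p) for p in pop])
    return pop[np.argmin(fit)], fit.min()

def multi_start_ga(n_starts=5):
    # The "multi-start" idea: restart the GA from fresh random
    # populations and keep the best solution across all runs.
    best_p, best_f = None, np.inf
    for _ in range(n_starts):
        p, f = run_ga()
        if f < best_f:
            best_p, best_f = p, f
    return best_p, best_f

p, f = multi_start_ga()
print(p, f)  # best point and Eggholder value found
```

Each restart begins from an independent random population, so a run trapped in one of the Eggholder function’s many local minima does not doom the whole search.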

Let’s see how it works.

For more videos like this, check out my YouTube channel here.

import numpy as np
import random
 
def objective_function(pop):
    # Eggholder function, subtracted from a large constant (10e6 = 1e7) so
    # that fitness is positive and larger is better: the GA maximizes
    # fitness, which minimizes the Eggholder value.
    fitness = np.zeros(pop.shape[0])
    for i in range(pop.shape[0]):
        x = pop[i]
        fitness[i] = 10e6 - (-(x[1] + 47) * np.sin(np.sqrt(abs(x[0] / 2 + (x[1] + 47))))
                             - x[0] * np.sin(np.sqrt(abs(x[0] - (x[1] + 47)))))
    return fitness
 
def selection(pop, fitness, pop_size):
    # Elitist roulette-wheel selection: carry the best individual over
    # unchanged, then draw the rest with probability proportional to fitness.
    next_generation = np.zeros((pop_size, pop.shape[1]))
    elite = np.argmax(fitness)
    next_generation[0] = pop[elite]  # keep the best
    fitness = np.delete(fitness, elite)
    pop = np.delete(pop, elite, axis=0)
    P = fitness / fitness.sum()  # selection probabilities
    index_selected = np.random.choice(pop.shape[0], size=pop_size - 1,
                                      replace=False, p=P)
    next_generation[1:] = pop[index_selected]
    return next_generation
 
def crossover(pop, crossover_rate):
    # Single-point crossover: pair up two distinct random parents and swap
    # their tails at a random cutting point. crossover_rate is the number
    # of offspring to create (must be even).
    offspring = np.zeros((crossover_rate, pop.shape[1]))
    for i in range(int(crossover_rate / 2)):
        r1 = random.randint(0, pop.shape[0] - 1)
        r2 = random.randint(0, pop.shape[0] - 1)
        while r1 == r2:  # ensure the two parents are distinct
            r2 = random.randint(0, pop.shape[0] - 1)
        cutting_point = random.randint(1, pop.shape[1] - 1)
        offspring[2 * i, :cutting_point] = pop[r1, :cutting_point]
        offspring[2 * i, cutting_point:] = pop[r2, cutting_point:]
        offspring[2 * i + 1, :cutting_point] = pop[r2, :cutting_point]
        offspring[2 * i + 1, cutting_point:] = pop[r1, cutting_point:]
    return offspring
 
def mutation(pop, mutation_rate):
    # Swap mutation: copy two distinct random parents and exchange a single
    # randomly chosen gene between the copies. mutation_rate is the number
    # of offspring to create (must be even).
    offspring = np.zeros((mutation_rate, pop.shape[1]))
    for i in range(int(mutation_rate / 2)):
        r1 = random.randint(0, pop.shape[0] - 1)
        r2 = random.randint(0, pop.shape[0] - 1)
        while r1 == r2:  # ensure the two parents are distinct
            r2 = random.randint(0, pop.shape[0] - 1)
        cutting_point = random.randint(0, pop.shape[1] - 1)
        offspring[2 * i] = pop[r1]
        offspring[2 * i, cutting_point] = pop[r2, cutting_point]
        offspring[2 * i + 1] = pop[r2]
        offspring[2 * i + 1, cutting_point] = pop[r1, cutting_point]
    return offspring
 
def local_search(pop, fitness, lower_bounds, upper_bounds, step_size, rate):
    # Local search around the current best individual: generate perturbed
    # copies of the elite along each decision variable.
    index = np.argmax(fitness)
    offspring = np.zeros((rate * 2 * pop.shape[1], pop.shape[1]))
    for r in range(rate):
        offspring1 = np.zeros((pop.shape[1], pop.shape[1]))
        for i in range(pop.shape[1]):
            offspring1[i] = pop[index]
...

Sorry! This is only half of the code.
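Although the listing stops here, the operators shown above are self-contained and can be tried on their own. Here is a small standalone demo of the single-point crossover idea (the seed, population values, and offspring count are illustrative, not taken from the video):

```python
import numpy as np
import random

def crossover(pop, crossover_rate):
    # Single-point crossover between pairs of distinct random parents,
    # as in the listing above; crossover_rate is the number of offspring.
    offspring = np.zeros((crossover_rate, pop.shape[1]))
    for i in range(int(crossover_rate / 2)):
        r1 = random.randint(0, pop.shape[0] - 1)
        r2 = random.randint(0, pop.shape[0] - 1)
        while r1 == r2:  # ensure the two parents are distinct
            r2 = random.randint(0, pop.shape[0] - 1)
        cut = random.randint(1, pop.shape[1] - 1)
        offspring[2 * i, :cut] = pop[r1, :cut]
        offspring[2 * i, cut:] = pop[r2, cut:]
        offspring[2 * i + 1, :cut] = pop[r2, :cut]
        offspring[2 * i + 1, cut:] = pop[r1, cut:]
    return offspring

random.seed(1)
pop = np.arange(8.0).reshape(4, 2)  # 4 parents, 2 genes each
kids = crossover(pop, 4)
print(kids.shape)  # (4, 2): every gene comes from one of the parents
```

With only two decision variables (as in the Eggholder problem), the cutting point is always 1, so each child takes its first gene from one parent and its second gene from the other.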

Notice: You can watch the video and re-type the Python code yourself – that would take you from one to three hours; or, for just €3.99 (the cost of a cup of coffee), you can download/copy the whole Python code within two minutes. The choice is yours.

The original price is €9.99, but today it’s only €3.99 (save €6 today – available for a limited time only).

Download the whole Python Code here (Membership Code ID: 012)

There is no need to build the Python code from scratch, because that is very time-consuming. My idol, Jim Rohn, once said: “Time is more valuable than money. You can get more money, but you cannot get more time.” If you think this code can be used in your research or teaching work, you should download it and then customize, modify, or apply it to your work, without any obligation to cite the original source if you don’t want to. However, redistribution (i.e., downloading the code/script here and then making it available on another site on the Internet) is strictly prohibited.

If you have any questions or problems, please contact Dr. Panda by email: learnwithpanda2018@gmail.com

Thank you very much and good luck with your research!

Dr. Panda
