Hello everyone! In this video, I’m going to show you my Python code for a multi-start genetic algorithm (multi-start GA). The strong performance of this genetic algorithm is demonstrated by solving a famous benchmark global optimization problem, the Eggholder function.
A genetic algorithm (GA) is one of the most popular stochastic optimization algorithms, often used to solve complex, large-scale optimization problems in various fields. The multi-start genetic algorithm is an improved version of the traditional genetic algorithm that enhances its search capability for global optimization.
Here are the details of the benchmark global optimization problem to be solved by this multi-start genetic algorithm.
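For reference, the raw Eggholder function (before the large-constant shift used in the fitness function below) can be checked directly against its known global minimum, f(512, 404.2319) ≈ −959.6407, on the usual domain [−512, 512] × [−512, 512]:

```python
import numpy as np

def eggholder(x):
    # Eggholder function: a highly multimodal 2-D benchmark,
    # usually evaluated on [-512, 512] x [-512, 512]
    return (-(x[1] + 47) * np.sin(np.sqrt(abs(x[0] / 2 + (x[1] + 47))))
            - x[0] * np.sin(np.sqrt(abs(x[0] - (x[1] + 47)))))

# Known global minimum: f(512, 404.2319) ≈ -959.6407
print(eggholder([512.0, 404.2319]))
```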
Here is the general structure of the multi-start genetic algorithm. It should be noted that this genetic algorithm has a local search algorithm integrated into its structure, so you could call it a multi-start hybrid genetic algorithm.
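The multi-start idea itself can be sketched in a few lines. The snippet below is a minimal, self-contained illustration on a toy objective (maximizing −Σx²); it uses simplified placeholder operators (tournament selection, blend crossover, Gaussian mutation) rather than the exact operators in the code below, and it omits the integrated local search step:

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(pop):
    # Toy objective: maximize -sum(x^2), optimum at the origin
    return -np.sum(pop ** 2, axis=1)

def tournament(pop, fit, k=3):
    # Pick the fittest of k randomly chosen individuals
    idx = rng.integers(0, len(pop), size=k)
    return pop[idx[np.argmax(fit[idx])]]

def one_start(pop_size=20, n_var=2, lb=-5.0, ub=5.0, generations=60):
    # One GA run from a fresh random population
    pop = rng.uniform(lb, ub, size=(pop_size, n_var))
    best, best_fit = None, -np.inf
    for _ in range(generations):
        fit = fitness(pop)
        i = int(np.argmax(fit))
        if fit[i] > best_fit:
            best, best_fit = pop[i].copy(), float(fit[i])
        children = np.empty_like(pop)
        for j in range(pop_size):
            # Tournament selection of two parents, blend crossover
            p1, p2 = tournament(pop, fit), tournament(pop, fit)
            a = rng.uniform()
            children[j] = a * p1 + (1 - a) * p2
        children += rng.normal(scale=0.1, size=children.shape)  # Gaussian mutation
        pop = np.clip(children, lb, ub)
        pop[0] = best  # elitism: keep the best-so-far in the population
    return best, best_fit

# Multi-start: run the GA several times from fresh random populations
# and keep the best result across all starts
results = [one_start() for _ in range(3)]
best, best_fit = max(results, key=lambda r: r[1])
```

Each restart begins from a fresh random population, and the best solution across all starts is kept; this restarting is what gives the multi-start scheme its robustness against getting trapped in local optima.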
Let’s see how it works.
For more videos like this, check my YouTube channel here.
import numpy as np
import random
def objective_function(pop):
    # Eggholder function, shifted by a large constant so that
    # minimizing the original function becomes maximizing fitness
    fitness = np.zeros(pop.shape[0])
    for i in range(pop.shape[0]):
        x = pop[i]
        fitness[i] = 10e6 - (-(x[1] + 47) * np.sin(np.sqrt(abs(x[0] / 2 + (x[1] + 47))))
                             - x[0] * np.sin(np.sqrt(abs(x[0] - (x[1] + 47)))))
    return fitness
def selection(pop, fitness, pop_size):
    # Elitism plus roulette-wheel (fitness-proportional) selection
    next_generation = np.zeros((pop_size, pop.shape[1]))
    elite = np.argmax(fitness)
    next_generation[0] = pop[elite]  # keep the best individual
    fitness = np.delete(fitness, elite)
    pop = np.delete(pop, elite, axis=0)
    P = [f / sum(fitness) for f in fitness]  # selection probabilities
    index = list(range(pop.shape[0]))
    index_selected = np.random.choice(index, size=pop_size - 1, replace=False, p=P)
    for j in range(pop_size - 1):
        next_generation[j + 1] = pop[index_selected[j]]
    return next_generation
def crossover(pop, crossover_rate):
    # One-point crossover: each pass picks two distinct parents
    # and produces two complementary offspring
    offspring = np.zeros((crossover_rate, pop.shape[1]))
    for i in range(int(crossover_rate / 2)):
        r1 = random.randint(0, pop.shape[0] - 1)
        r2 = random.randint(0, pop.shape[0] - 1)
        while r1 == r2:
            r1 = random.randint(0, pop.shape[0] - 1)
            r2 = random.randint(0, pop.shape[0] - 1)
        cutting_point = random.randint(1, pop.shape[1] - 1)
        offspring[2 * i, :cutting_point] = pop[r1, :cutting_point]
        offspring[2 * i, cutting_point:] = pop[r2, cutting_point:]
        offspring[2 * i + 1, :cutting_point] = pop[r2, :cutting_point]
        offspring[2 * i + 1, cutting_point:] = pop[r1, cutting_point:]
    return offspring
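To see what one-point crossover produces, here is a standalone check of the cut-and-splice step with two fixed parents and a fixed cutting point (the GA above picks both at random):

```python
import numpy as np

p1 = np.array([1.0, 1.0, 1.0, 1.0])
p2 = np.array([2.0, 2.0, 2.0, 2.0])
cut = 2  # cutting point fixed here for illustration (randomized in the GA)

# Each child takes the head of one parent and the tail of the other
child1 = np.concatenate([p1[:cut], p2[cut:]])
child2 = np.concatenate([p2[:cut], p1[cut:]])
print(child1)  # [1. 1. 2. 2.]
print(child2)  # [2. 2. 1. 1.]
```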
def mutation(pop, mutation_rate):
    # Gene-exchange mutation: two distinct parents swap the gene
    # at one randomly chosen position
    offspring = np.zeros((mutation_rate, pop.shape[1]))
    for i in range(int(mutation_rate / 2)):
        r1 = random.randint(0, pop.shape[0] - 1)
        r2 = random.randint(0, pop.shape[0] - 1)
        while r1 == r2:
            r1 = random.randint(0, pop.shape[0] - 1)
            r2 = random.randint(0, pop.shape[0] - 1)
        cutting_point = random.randint(0, pop.shape[1] - 1)
        offspring[2 * i] = pop[r1]
        offspring[2 * i, cutting_point] = pop[r2, cutting_point]
        offspring[2 * i + 1] = pop[r2]
        offspring[2 * i + 1, cutting_point] = pop[r1, cutting_point]
    return offspring
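The mutation above is a gene-exchange operator: two parents swap the gene at one position, each yielding one mutated offspring. A standalone check with a fixed swap position (randomized in the GA):

```python
import numpy as np

p1 = np.array([1.0, 1.0, 1.0])
p2 = np.array([2.0, 2.0, 2.0])
point = 1  # swap position fixed here for illustration (randomized in the GA)

# Copy each parent, then exchange the gene at the chosen position
c1, c2 = p1.copy(), p2.copy()
c1[point], c2[point] = p2[point], p1[point]
print(c1)  # [1. 2. 1.]
print(c2)  # [2. 1. 2.]
```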
def local_search(pop, fitness, lower_bounds, upper_bounds, step_size, rate):
    # Local search: generate neighbors around the current best individual
    index = np.argmax(fitness)
    offspring = np.zeros((rate * 2 * pop.shape[1], pop.shape[1]))
    for r in range(rate):
        offspring1 = np.zeros((pop.shape[1], pop.shape[1]))
        for i in range(pop.shape[1]):
            offspring1[i] = pop[index]
…
Sorry! This is only half of the code.
Notice: It’s possible to watch the video and re-type the Python code yourself – that would take you 1 to 3 hours; or, for just €3.99 (the cost of a cup of coffee), you can download/copy the whole Python code within 2 minutes. It’s your choice to make.
The original price is €9.99, but today it’s only €3.99 (save €6 today – available for a limited time only).
Download the whole Python Code here (Membership Code ID: 012)
No need to build the Python code from scratch, because that is very time-consuming. My idol, Jim Rohn, once said: “Time is more valuable than money. You can get more money, but you cannot get more time.” If you think this code can be used in your research/teaching work, you should download it and then customize/modify/apply it to your work, without any obligation to cite the original source if you don’t want to. However, redistribution (i.e., downloading the code/script here and then making it available on another site on the Internet) is strictly prohibited.
If you have any question or problem, please contact Dr. Panda by email: learnwithpanda2018@gmail.com
Thank you very much and good luck with your research!
Comments
Hi Panda,
Great work, thanks for sharing!!
Could you share the code? I'm working with GA in Python, but my code has really bad performance.
Congrats on your work!!
Hi, thanks for your interest. I will share it as soon as possible.
Hi Bruno, here is the Python code of my multi-start genetic algorithm. Thanks!
Hello, please, I need to have a chat with you. Could you please send me an email at dipokoya2003@yahoo.com? I need your service.
Hi, here is my email: learnwithpanda2018@gmail.com
Please, we need to talk. I have sent you an email. I want to download some of your codes for practice and to see how they work. Is the discount still available? However, I need a bit of documentation or explanation. Could you please help with that?
Just get in touch via email.
Hello, please check your email. Many thanks for your interest!
Can someone assist? I need a local search algorithm for the knapsack problem in R/Python.