Hello everyone. In this video, I’m going to show you the Python code of my adaptive-restart hybrid genetic algorithm for global optimization.
The genetic algorithm (GA) is one of the most widely used global optimization methods across many research fields. However, like other search techniques, the GA has no strong theoretical guarantee of reaching the global optimum; it can only offer a probabilistic one. Having a GA that finds the global optimal solution with very high success probability is therefore always desirable.
In this video, I will show you an innovative GA structure, in which adaptive restarts and a local search mechanism are integrated, to improve the algorithm’s success rate in reaching the global optimal solution. The robustness of this GA is demonstrated on a well-known, difficult global optimization problem.
Here is the innovative structure of the GA:
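The structure can be sketched as a standard GA loop wrapped in an adaptive-restart outer loop. Below is a minimal, illustrative Python sketch; the function name, population size, stall threshold, and the placeholder variation step are my assumptions for illustration, not the exact settings from the video:

```python
import numpy as np

def run_with_restarts(evaluate, lower, upper, pop_size=50, max_gen=200,
                      stall_limit=20, seed=None):
    """Sketch of the adaptive-restart idea: when the best fitness stops
    improving for `stall_limit` generations, re-seed the population at
    random while keeping the best-so-far individual as an elite."""
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    pop = rng.uniform(lower, upper, size=(pop_size, lower.size))
    best_x, best_f, stall = pop[0].copy(), -np.inf, 0
    for gen in range(max_gen):
        fit = evaluate(pop)
        i = int(np.argmax(fit))
        if fit[i] > best_f:
            best_x, best_f, stall = pop[i].copy(), float(fit[i]), 0
        else:
            stall += 1
        if stall >= stall_limit:
            # adaptive restart: fresh random population plus the elite
            pop = rng.uniform(lower, upper, size=(pop_size, lower.size))
            pop[0] = best_x
            stall = 0
        # selection / crossover / mutation / local search go here;
        # a random perturbation stands in for them in this sketch
        pop = np.clip(pop + rng.normal(0, 0.5, pop.shape), lower, upper)
    return best_x, best_f
```

In the full algorithm, the placeholder perturbation step is replaced by the selection, crossover, mutation, and local-search operators shown below.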
Here is the benchmark test problem. As you can see, it is a very difficult problem for search-based optimization algorithms. Nevertheless, this proposed GA always finds the global optimal solution to the problem.
Let’s see how this GA works:
For more videos like this, check my YouTube channel here.
import numpy as np
import random
def objective_function(pop):
    # Benchmark: a sum of five inverse-quadratic peaks; the tallest
    # peak (amplitude 1) is centred at (30, 0).
    fitness = np.zeros(pop.shape[0])
    for i in range(pop.shape[0]):
        x = pop[i]
        fitness[i] = 0.4 / (1 + 0.02 * ((x[0] - (-20)) ** 2 + (x[1] - (-20)) ** 2)) \
            + 0.2 / (1 + 0.5 * ((x[0] - (-5)) ** 2 + (x[1] - (-25)) ** 2)) \
            + 0.7 / (1 + 0.01 * ((x[0] - 0) ** 2 + (x[1] - 30) ** 2)) \
            + 1 / (1 + 2 * ((x[0] - 30) ** 2 + (x[1] - 0) ** 2)) \
            + 0.05 / (1 + 0.1 * ((x[0] - 30) ** 2 + (x[1] - (-30)) ** 2))
    return fitness
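Written out as data, the objective above is a sum of five inverse-quadratic peaks; evaluating it at each peak centre shows why (30, 0) is the global optimum. This compact reformulation is mine, but the peak parameters are copied from objective_function above:

```python
import numpy as np

# (amplitude, width, centre_x, centre_y) for each of the five peaks
PEAKS = [(0.4, 0.02, -20, -20),
         (0.2, 0.5, -5, -25),
         (0.7, 0.01, 0, 30),
         (1.0, 2.0, 30, 0),
         (0.05, 0.1, 30, -30)]

def f(x, y):
    # Same formula as objective_function, for a single point (x, y)
    return sum(a / (1 + w * ((x - cx) ** 2 + (y - cy) ** 2))
               for a, w, cx, cy in PEAKS)

for _, _, cx, cy in PEAKS:
    print(f"f({cx:4d}, {cy:4d}) = {f(cx, cy):.4f}")
```

The value at (30, 0) is about 1.044 (the amplitude-1 peak plus small contributions from the others), while the second-highest peak, at (0, 30), only reaches about 0.707 — a classic trap for local search.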
def selection(pop, fitness, pop_size):
    # Elitist roulette-wheel selection: the best individual is copied
    # unchanged; the rest are sampled in proportion to their fitness.
    next_generation = np.zeros((pop_size, pop.shape[1]))
    elite = np.argmax(fitness)
    next_generation[0] = pop[elite]  # keep the best
    fitness = np.delete(fitness, elite)
    pop = np.delete(pop, elite, axis=0)
    P = fitness / fitness.sum()  # selection probabilities
    index_selected = np.random.choice(pop.shape[0], size=pop_size - 1,
                                      replace=False, p=P)
    next_generation[1:] = pop[index_selected]
    return next_generation
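The heart of the selection step is numpy’s weighted sampling. A toy example with made-up fitness values (note that replace=False means no individual can be picked twice, so the number drawn must not exceed the population size):

```python
import numpy as np

np.random.seed(42)  # fixed seed, for reproducibility only
fitness = np.array([0.1, 0.4, 0.3, 0.2])
P = fitness / fitness.sum()  # fitness-proportionate probabilities
picks = np.random.choice(4, size=3, replace=False, p=P)
print(picks)  # three distinct indices, biased toward the fitter ones
```

Index 1 (fitness 0.4) is the most likely to appear, index 0 (fitness 0.1) the least.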
def crossover(pop, crossover_rate):
    # One-point crossover; crossover_rate is the (even) number of offspring.
    offspring = np.zeros((crossover_rate, pop.shape[1]))
    for i in range(int(crossover_rate / 2)):
        r1 = random.randint(0, pop.shape[0] - 1)
        r2 = random.randint(0, pop.shape[0] - 1)
        while r1 == r2:  # make sure the two parents are different
            r1 = random.randint(0, pop.shape[0] - 1)
            r2 = random.randint(0, pop.shape[0] - 1)
        cutting_point = random.randint(1, pop.shape[1] - 1)
        offspring[2 * i, :cutting_point] = pop[r1, :cutting_point]
        offspring[2 * i, cutting_point:] = pop[r2, cutting_point:]
        offspring[2 * i + 1, :cutting_point] = pop[r2, :cutting_point]
        offspring[2 * i + 1, cutting_point:] = pop[r1, cutting_point:]
    return offspring
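For intuition, here is one-point crossover applied to a toy pair of 4-gene parents with a fixed cutting point (the parent values and the cut are made up; in the 2-variable benchmark above the cut can only fall at 1, so crossover simply exchanges the two coordinates between parents):

```python
import numpy as np

parent1 = np.array([1.0, 2.0, 3.0, 4.0])
parent2 = np.array([9.0, 8.0, 7.0, 6.0])
cut = 2  # the GA draws this at random in 1..n_genes-1

# each child takes the head of one parent and the tail of the other
child1 = np.concatenate([parent1[:cut], parent2[cut:]])
child2 = np.concatenate([parent2[:cut], parent1[cut:]])
print(child1)  # [1. 2. 7. 6.]
print(child2)  # [9. 8. 3. 4.]
```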
def mutation(pop, mutation_rate):
    # Swap one randomly chosen gene between two distinct parents;
    # mutation_rate is the (even) number of offspring.
    offspring = np.zeros((mutation_rate, pop.shape[1]))
    for i in range(int(mutation_rate / 2)):
        r1 = random.randint(0, pop.shape[0] - 1)
        r2 = random.randint(0, pop.shape[0] - 1)
        while r1 == r2:  # make sure the two parents are different
            r1 = random.randint(0, pop.shape[0] - 1)
            r2 = random.randint(0, pop.shape[0] - 1)
        cutting_point = random.randint(0, pop.shape[1] - 1)
        offspring[2 * i] = pop[r1]
        offspring[2 * i, cutting_point] = pop[r2, cutting_point]
        offspring[2 * i + 1] = pop[r2]
        offspring[2 * i + 1, cutting_point] = pop[r1, cutting_point]
    return offspring
def local_search(pop, fitness, lower_bounds, upper_bounds, step_size, rate):
    # Explore the neighbourhood of the current best individual
    index = np.argmax(fitness)
    offspring = np.zeros((rate * 2 * pop.shape[1], pop.shape[1]))
    for r in range(rate):
        offspring1 = np.zeros((pop.shape[1], pop.shape[1]))
        for i in range(pop.shape[1]):
            offspring1[i] = pop[index]
...
Sorry! This is only half of the code.
Notice: You can watch the video and re-type the Python code yourself – that would take you 1 to 3 hours – or, for just €3.99 (the cost of a cup of coffee), you can download/copy the whole Python code within 2 minutes. It’s your choice to make.
Original price is €9.99 but today it’s only €3.99 (save €6 today – available for a limited time only)
Download the whole Python code here (Membership Code ID: 014)
No need to build the Python code from scratch, because that is very time-consuming. My idol, Jim Rohn, once said: “Time is more valuable than money. You can get more money, but you cannot get more time.” If you think this code can be used in your research/teaching work, you should download it and then customize/modify/apply it to your work, without any obligation to cite the original source if you don’t want to. However, redistribution (i.e., downloading the code/script here and then making it available on another site on the Internet) is strictly prohibited.
If you have any question or problem, please contact Dr. Panda by email: learnwithpanda2018@gmail.com
Thank you very much and good luck with your research!
Dear sir, can you please send the code to my email: mohamed.nishad2011@gmail.com
Hi Nishad, the Python code of my adaptive-restart genetic algorithm is now uploaded here. Thanks for watching and good luck with your research!
Please upload the code.
Thanks, I will try it ASAP
Hello Mohan, the Python code of my adaptive-restart genetic algorithm is now here. Thanks for watching!
Can I get your code? Please e-mail it to me. Thanks!
Thanks for your visit. I will upload it as soon as possible.
Hi, Terry, here is the python code of my adaptive restart genetic algorithm. Many thanks for watching and good luck with your research!
Can I get the theory behind this optimization?
You can get some info here: https://www.sciencedirect.com/science/article/abs/pii/S0957417417305572
Hello Doctor Panda
Can I get both the Python code and the MATLAB code of this optimization algorithm?
Yes, both codes are posted on this blog.
Hello sir, do you have any virtual power plant optimization algorithm using python?
Hello, I don’t have that algorithm
Great job and thank you for sharing this. I was wondering if you could explain the difference between adaptive re-start genetic algorithm and the multi-start genetic algorithm, thank you.
They are very similar
I want to get the code of Adaptive Re-Start Hybrid Genetic Algorithm for Global Optimization (Python Code)
Hi, you can download yourself here …