Hello everyone!
In this post, I am going to show you my innovative version of the Genetic Algorithm, which I call the Adaptive Re-start Hybrid Genetic Algorithm. The algorithm is coded in Matlab.
Let’s see how this Genetic Algorithm performs on a benchmark optimization problem.
The Genetic Algorithm (GA) is a popular meta-heuristic, stochastic optimization algorithm based on the mechanisms of natural selection in Charles Darwin’s theory of evolution. GA was first introduced by Holland in 1975, and it remains very popular in various research communities. Did you know that GA is one of the most widely used stochastic optimization algorithms for solving complex, large-scale optimization problems in many fields?
GA is one of the most general global optimization methods and has been used in countless research publications. However, like other stochastic search techniques, GA offers only a weak theoretical guarantee of finding the global optimal solution; the guarantee is probabilistic. A GA capable of finding the global optimum with a very high success probability is therefore always desirable. In this research, an innovative GA structure, in which adaptive restart, local search, and chromosome elite-transfer strategies are harmoniously integrated, is proposed to improve the traditional GA’s success rate in reaching the global optimal solution. The robustness of the proposed GA structure is demonstrated through a number of case studies and benchmark optimization problems.
Here are the details of the benchmark optimization problem, with its known global optimal solution and minimum objective function value. It should be noted that, in this post, we have chosen the dimension of the problem to be d = 10, which means this benchmark is a 10-dimensional global optimization problem.
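For reference, the benchmark used here is the d-dimensional Ackley function; this can be read off directly from objective_function.m later in this post:

f(\mathbf{x}) = -a\,\exp\!\left(-b\sqrt{\frac{1}{d}\sum_{i=1}^{d} x_i^2}\right) - \exp\!\left(\frac{1}{d}\sum_{i=1}^{d}\cos(c\,x_i)\right) + a + e

with a = 20, b = 0.2, and c = 2\pi. The search domain is x_i ∈ [-32.768, 32.768], and the known global minimum is f(x*) = 0 at x* = (0, …, 0).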
Please have a quick look at some important genetic operators and the related Matlab code. Here is the initial random population of the GA, which consists of a number of random chromosomes. Did you know that, in a GA, a chromosome is a data structure used to represent a potential solution to a given problem?
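As a concrete illustration (the numbers here are hypothetical), a chromosome for a 5-variable problem is simply a row vector of real values, and a population is a matrix whose rows are chromosomes:

% A hypothetical chromosome for a 5-variable problem:
X = [0.37, -0.82, 0.05, 0.91, -0.44];
% A population of random chromosomes, generated with population.m (listed later in this post):
P = population(4, 5, -ones(1,5), ones(1,5)); % 4 random chromosomes in [-1, 1]^5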
Here is the crossover operator. Did you know that crossover is a key genetic operator that mimics the process of recombination in biological reproduction? It involves combining two parent solutions, often referred to as parent chromosomes, to create new offspring chromosomes, which inherit characteristics from both parents. Crossover is one of the main methods by which Genetic Algorithms explore the solution space and promote the mixing of genetic material to generate new solutions.
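As a small worked example (hypothetical parents), here is the single-point crossover that crossover.m below performs, with the cut point fixed at 3 for illustration (in the real code it is chosen at random):

A1 = [1 2 3 4];          % parent 1
A2 = [5 6 7 8];          % parent 2
r2 = 3;                  % cut point (random in crossover.m)
B1 = A1(r2:end);         % save the tail of parent 1
A1(r2:end) = A2(r2:end); % offspring 1 = [1 2 7 8]
A2(r2:end) = B1;         % offspring 2 = [5 6 3 4]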
Here is the mutation operator. Mutation is an operator that introduces random changes to individual solutions within a population. It is one of the key mechanisms for maintaining genetic diversity, preventing premature convergence, and exploring a broader solution space. Mutation in Genetic Algorithms draws inspiration from biological mutation, where changes occur in the genetic code due to various factors.
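Note that in this implementation (mutation.m, listed later in this post), mutation swaps a single randomly chosen gene between two chromosomes, rather than replacing a gene with a new random value. A small worked example with hypothetical numbers:

A1 = [1 2 3 4];  % chromosome 1
A2 = [5 6 7 8];  % chromosome 2
r2 = 2;          % selected gene position (random in mutation.m, fixed here for illustration)
A0 = A1(r2);     % temporary copy of the selected gene
A1(r2) = A2(r2); % chromosome 1 becomes [1 6 3 4]
A2(r2) = A0;     % chromosome 2 becomes [5 2 7 8]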
Here is the objective function. In a Genetic Algorithm, the objective function is a critical component that defines the goal of the algorithm. It is used to evaluate and measure the “fitness” or quality of individual solutions within a population. The objective function helps the Genetic Algorithm determine how well a given solution meets the problem’s requirements or optimizes a specific outcome.
Here is the selection operator. In a GA, “selection” refers to the process of choosing which individuals (or solutions) from the current population will be used to create the next generation. Selection is a crucial component of Genetic Algorithms because it determines which traits will be passed on and how the population evolves over time. It is inspired by the concept of natural selection, where individuals with favorable traits are more likely to survive and reproduce.
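The selection.m listed later in this post combines elitism (keeping the top s chromosomes) with roulette-wheel selection. Here is a minimal standalone sketch of the roulette-wheel idea, with hypothetical fitness values (illustration only; it is not the full selection.m):

B = [2 5 3];         % fitness values of 3 chromosomes
D = B/sum(B);        % selection probabilities: [0.2 0.5 0.3]
E = cumsum(D);       % cumulative probabilities: [0.2 0.7 1.0]
N = rand;            % spin the wheel
k = find(N <= E, 1); % index of the selected chromosome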
Here is the main program of the GA.
Now, let’s see how this Genetic Algorithm works:
Convergence of the Genetic Algorithm
Convergence in a Genetic Algorithm refers to the process by which the population of solutions evolves to a state where significant further improvements are unlikely, indicating that the algorithm is approaching a stable solution or optimal outcome. It reflects the degree to which the Genetic Algorithm has settled on a specific region of the solution space, often because the solutions have become similar or a specific pattern has emerged.
Convergence can be observed through various signals in a Genetic Algorithm.
Here are some indicators of convergence (a small Matlab sketch for monitoring the first two follows this list):
1. Stability in Fitness: When the best fitness or average fitness of the population remains relatively constant over several generations, it suggests that the algorithm is converging.
2. Reduced Genetic Diversity: As convergence occurs, the population tends to become more homogenous, with individuals sharing similar traits.
3. Repeated Solutions: Convergence can lead to situations where the same or similar solutions appear frequently in the population, indicating that the algorithm is focusing on a specific area of the solution space.
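As a minimal sketch of how the first two indicators could be monitored during a run (the variables window and best_hist and the thresholds are my own illustrative assumptions, not part of the GA code below):

% Illustration only: detect convergence signals inside the generation loop
window = 10;        % look-back window in generations (assumed value)
% best_hist(k) is assumed to hold the best objective value at generation k
if j > window && abs(best_hist(j) - best_hist(j - window)) < 1e-8
    disp('Best fitness has stabilized (indicator 1)');
end
diversity = mean(std(P)); % average per-gene spread of the population P
if diversity < 1e-3
    disp('Genetic diversity is low (indicator 2)');
end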
Here are some causes of convergence:
1. High Selection Pressure: Strong selection favoring the fittest individuals can lead to a rapid convergence to a specific solution or set of solutions.
2. Low Mutation Rate: Without sufficient mutation, genetic diversity can decrease, contributing to convergence.
3. Elitism: Keeping the best solutions across generations can accelerate convergence, while also helping maintain high-quality solutions.
Here are some benefits and risks of convergence:
1. Benefits: Convergence can indicate that the Genetic Algorithm has found a high-quality solution or is nearing the optimal solution. It can be a desired outcome, especially when a well-defined optimal solution exists.
2. Risks: Premature convergence occurs when the algorithm settles too quickly on a local optimum, missing the global optimum. This can happen if the genetic diversity is too low or if selection pressure is too high.
How do we manage the convergence?
1. Maintaining Diversity: Strategies like higher mutation rates, diverse crossover techniques, or injecting random individuals into the population can help delay convergence and encourage broader exploration.
2. Restarting or Adaptive Strategies: Some Genetic Algorithms incorporate restart mechanisms or adaptive techniques to combat premature convergence and encourage exploration; a minimal restart sketch follows this list.
3. Balancing Exploration and Exploitation: Genetic Algorithms must strike a balance between exploring new solutions and exploiting known high-fitness solutions to avoid premature convergence while still converging to a high-quality solution.
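Here is a simplified sketch of the restart idea used in this post’s GA (the counter stall_count is my own illustrative name; the complete logic, including the adaptive part, is in GA.m below): if the best solution has not improved for rs consecutive generations, the population is re-initialized while the top g chromosomes are carried over.

% Illustration only: restart with elite transfer when the search stalls
if stall_count >= rs                   % rs = restart threshold (generations)
    P_elite = P(1:g, :);               % keep the top g chromosomes
    P = population(p - g, nv, lb, ub); % fresh random population
    P(p-g+1:p, :) = P_elite;           % transfer the elites into it
    stall_count = 0;                   % reset the stall counter
end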
In summary
Convergence in a Genetic Algorithm is the process of the population stabilizing around a specific solution or region in the solution space. While convergence can be a positive outcome, indicating that the algorithm has found a good solution, premature convergence can be problematic, leading to sub-optimal results. Proper management of selection, mutation, and diversity is crucial to achieving effective convergence in Genetic Algorithms.
Matlab Code
population.m
function YY = population(p, nv, lb, ub)
% p  = population size
% nv = number of variables
% lb = lower bound
% ub = upper bound
Y = zeros(p, nv); % preallocate for speed
for i = 1:p
    for j = 1:nv
        Y(i,j) = (ub(j) - lb(j))*rand + lb(j); % uniform random value in [lb(j), ub(j)]
    end
end
YY = Y;
crossover.m
function Y = crossover(P, n)
% P = population
% n = the number of pairs of chromosomes to be crossed
%     (equivalent to the traditional crossover rate)
[x1, y1] = size(P);
Z = zeros(2*n, y1);
for i = 1:n
    r1 = randi(x1, 1, 2); % select 2 random parent chromosomes
    while r1(1) == r1(2)  % make sure the 2 selected chromosomes are different
        r1 = randi(x1, 1, 2);
    end
    A1 = P(r1(1),:);             % parent 1
    A2 = P(r1(2),:);             % parent 2
    r2 = 1 + randi(y1 - 1);      % random cut point
    B1 = A1(1, r2:y1);           % save the tail of parent 1
    A1(1, r2:y1) = A2(1, r2:y1); % swap the two tails
    A2(1, r2:y1) = B1;
    Z(2*i-1,:) = A1; % offspring 1
    Z(2*i,:) = A2;   % offspring 2
end
Y = Z;
mutation.m
function Y = mutation(P, n)
% P = population
% n = the number of pairs of chromosomes to be mutated
%     (equivalent to the traditional mutation rate)
[x1, y1] = size(P);
Z = zeros(2*n, y1);
for i = 1:n
    r1 = randi(x1, 1, 2); % select 2 random chromosomes
    while r1(1) == r1(2)  % make sure the 2 selected chromosomes are different
        r1 = randi(x1, 1, 2);
    end
    A1 = P(r1(1),:); % chromosome 1
    A2 = P(r1(2),:); % chromosome 2
    r2 = randi(y1);  % random gene position
    A0 = A1(r2);     % swap the selected gene between the two chromosomes
    A1(r2) = A2(r2);
    A2(r2) = A0;
    Z(2*i-1,:) = A1;
    Z(2*i,:) = A2;
end
Y = Z;
local_search.m
function Y = local_search(X, s, lb, ub)
% X  = current best chromosome
% s  = step size
% lb = lower bound
% ub = upper bound
[x, y] = size(X);
A = ones(2*y, 1)*X; % matrix with 2*y copies of the best chromosome
j = 1;
for i = 1:y
    % ... (loop body omitted here; it is included in the full code download)
end
Y = A;
objective_function.m
function Y = objective_function(X)
% The Ackley benchmark function
d = length(X); % dimension of the problem
% Parameters of the Ackley function:
a = 20;
b = 0.2;
c = 2*pi;
A1 = 0;
A2 = 0;
for i = 1:d
    xi = X(i);           % variable xi
    A1 = A1 + xi^2;      % sum of squares
    A2 = A2 + cos(c*xi); % sum of cosines
end
Y = -a*exp(-b*sqrt(A1/d)) - exp(A2/d) + a + exp(1);
end
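A quick sanity check of this function: at the known global optimum x* = (0, …, 0), it should return 0 (up to floating-point round-off):

Y0 = objective_function(zeros(1,10)); % ~0 (on the order of 1e-16) at the global optimum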
evaluation.m
function YY = evaluation(P, ot, co)
% P  = population
% ot = optimization type (ot = 1 for maximization, otherwise minimization)
% co = coefficient for converting a min problem to a max problem
%      (to make sure that the fitness is always positive)
[x1, y1] = size(P);
H = zeros(1, x1);
for i = 1:x1
    H(i) = objective_function(P(i,:));
end
% Depending on the type of optimization:
if ot == 1 % maximization problem
    Y = H + co; % add co to make sure all elements in Y are positive
else % minimization problem
    K = zeros(1, x1);
    for i = 1:x1
        K(i) = 1/(co + H(i)); % convert from min to max
    end
    Y = K;
end
YY = Y;
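For example (hypothetical numbers), with co = 20 a minimization objective value of H = 5 is mapped to a fitness of 1/(20 + 5) = 0.04, while H = 0 maps to 0.05, so smaller objective values receive larger fitness, which is what the roulette-wheel selection needs:

P = population(2, 10, -32.768*ones(1,10), 32.768*ones(1,10)); % 2 random chromosomes
F = evaluation(P, -1, 20); % ot = -1 (minimization), co = 20; higher F = fitter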
selection.m
function [YY1, YY2] = selection(P, B, p, s)
% P = population
% B = fitness values
% p = population size
% s = keep top s chromosomes
%------------------------------
% Top (elite) selection operation
[x1, y1] = size(P);
Y1 = zeros(p, y1);
Fn = zeros(1, p);
for i = 1:s
    [r1, c1] = find(B == max(B)); % locate the current best chromosome
    Y1(i,:) = P(max(c1),:);       % keep it
    Fn(i) = B(max(c1));           % keep its fitness value
    P(max(c1),:) = [];            % remove it from the pool
    B(:,max(c1)) = [];            % remove its fitness value
end
%------------------------------
% Determine total fitness of the population
C = sum(B);
% Determine selection probabilities
D = B/C;
% Determine cumulative probabilities
E = cumsum(D);
N = rand(1);
d1 = 1;
d2 = s;
while d2 <= p-1
    if N <= E(d1)
        % ... (the rest of the roulette-wheel loop is omitted here; it is included in the full code download)
end
YY1 = Y1;
YY2 = Fn;
GA.m
clear all
clc
close all
tic
%--------------------------------------------------------------------------
% The problem parameters:
nv = 10;                 % number of variables
lb = -32.768*ones(1,nv); % lower bound = -32.768
ub = 32.768*ones(1,nv);  % upper bound = 32.768
ot = -1;                 % minimization: ot = -1; maximization: ot = 1
%--------------------------------------------------------------------------
% Parameters of the GA:
p = 100;  % population size
c = 30;   % the number of pairs of chromosomes to be crossed (equivalent to the traditional crossover rate)
m = 20;   % the number of pairs of chromosomes to be mutated (equivalent to the traditional mutation rate)
rs = 10;  % adaptive restart threshold (generations)
g = 5;    % keep top g chromosomes
r = 5;    % number of chromosomes in the initial guess
msl = 1;  % max step size for the local search
co = 20;  % coefficient for converting a min problem to a max problem (to make sure that the fitness is always positive)
%--------------------------------------------------------------------------
% Termination criteria:
t = 180;   % computing time (s)
tg = 1000; % the number of generations of the GA
%--------------------------------------------------------------------------
figure
xlabel('Generation')
ylabel('Objective function value')
if ot == 1
    title('Blue dots = Maximum value   Red dots = Average value');
else
    title('Blue dots = Minimum value   Red dots = Average value');
end
hold on
P1 = population(r, nv, lb, ub); % initial guess
w = 1;
ww = 1;
i = 1;
j = 1;
while j <= tg
    P = population(p-r, nv, lb, ub); % fresh random population
    P(p-r+1:p,:) = P1;               % transfer the initial-guess chromosomes
    ms = msl;
    while i <= tg
        % Extended population: add offspring from crossover, mutation, and local search
        P(p+1:p+2*c,:) = crossover(P, c);
        P(p+2*c+1:p+2*c+2*m,:) = mutation(P, m);
        P(p+2*c+2*m+1:p+2*c+2*m+2*nv,:) = local_search(P(1,:), ms, lb, ub);
        % ... (the rest of the main program is omitted here; it is included in the full code download)
Sorry! This is only half of the Matlab code!
Do you want to get the whole Matlab code of my innovative Genetic Algorithm to use in your research, thesis, report, and publications?
It’s possible to watch the video and re-type the Matlab code yourself, which would take you 1 to 3 hours; or, for just €4.99 (the cost of a cup of coffee), you can download the whole Matlab code within 3 minutes. It’s your choice to make.
The original price is €9.99, but today it’s only €4.99 (save €5 today, available for a limited time only).
Download the whole Matlab code here (Membership Code ID: 040)
No need to build the Matlab code from scratch, because that is very time-consuming. My idol, Jim Rohn, once said: “Time is more valuable than money. You can get more money, but you cannot get more time.” If you think this code can be used in your research or teaching work, you should download it and then customize, modify, and apply it to your work, without any obligation to cite the original source if you don’t want to. However, redistribution (i.e., downloading the code/script here and then making it available on another site on the Internet) is strictly prohibited.
If you have any question or problem, please contact Dr. Panda by email: learnwithpanda2018@gmail.com
Thank you very much and good luck with your research!