Adaptive Re-Start Hybrid Genetic Algorithm in Matlab

Hello everyone!

Let’s see how my innovative version of the Genetic Algorithm, called the Adaptive Re-start Hybrid Genetic Algorithm, works on this benchmark optimization problem.

The Genetic Algorithm (GA) is a popular metaheuristic, stochastic optimization algorithm based on the mechanism of natural selection in Charles Darwin’s theory of evolution. GA was first introduced by Holland in 1975, and it is still very popular in various research communities. Did you know that GA is one of the most popular stochastic optimization algorithms, often used to solve complex, large-scale optimization problems in many fields?

GA is one of the most general global optimization methods, used in countless research publications. However, like other search techniques, GA has no theoretical guarantee of finding the global optimal solution and can only offer a probabilistic one. A GA capable of finding the global optimal solution with very high success probability is therefore always desirable. In this research, an innovative GA structure, in which adaptive restart, local search, and chromosome elite-transfer strategies are harmoniously integrated, is proposed to improve the traditional GA’s success rate in reaching the global optimal solution. The robustness of the proposed GA structure is demonstrated through a number of case studies and benchmark optimization problems.

Let’s see the Matlab code of my Adaptive Re-start Hybrid Genetic Algorithm, and how it works:

Matlab Code

population.m

function YY = population(p,nv,lb,ub)
% p  = population size
% nv = number of variables
% lb = lower bound
% ub = upper bound
Y = zeros(p,nv); % preallocate for speed
for i = 1:p
    for j = 1:nv
        Y(i,j) = (ub(j)-lb(j))*rand + lb(j); % uniform random gene in [lb(j), ub(j)]
    end
end
YY = Y;
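For readers who want to check the behaviour quickly, here is the same initialization logic sketched in Python (the function name and list-of-lists representation are my own choices, not part of the original code):

```python
import random

def population(p, nv, lb, ub):
    """Generate p random chromosomes; gene j is drawn uniformly
    from [lb[j], ub[j]], mirroring population.m."""
    return [[(ub[j] - lb[j]) * random.random() + lb[j] for j in range(nv)]
            for _ in range(p)]
```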

crossover.m

function Y = crossover(P,n)
% P = population
% n = the number of pairs of chromosomes to be ...
% crossed (equivalent to the traditional crossover rate)
[x1,y1] = size(P);
Z = zeros(2*n,y1);
for i = 1:n
    r1 = randi(x1,1,2);     % select 2 random parent chromosomes
    while r1(1) == r1(2)
        r1 = randi(x1,1,2); % make sure the 2 selected chromosomes differ
    end
    A1 = P(r1(1),:);        % parent 1
    A2 = P(r1(2),:);        % parent 2
    r2 = 1 + randi(y1-1);   % random cutting point
    B1 = A1(1,r2:y1);       % exchange the gene segments after the cut
    A1(1,r2:y1) = A2(1,r2:y1);
    A2(1,r2:y1) = B1;
    Z(2*i-1,:) = A1;
    Z(2*i,:)   = A2;
end
Y = Z;
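The same single-point crossover, sketched in Python (my own simplified representation, not the author's code): pick two distinct parents, cut at a random point, and swap the tails.

```python
import random

def crossover(P, n):
    """Single-point crossover mirroring crossover.m: n distinct parent
    pairs produce 2*n children. Requires chromosomes of length >= 2."""
    x1, y1 = len(P), len(P[0])
    children = []
    for _ in range(n):
        i, j = random.sample(range(x1), 2)   # two distinct parents
        cut = random.randint(1, y1 - 1)      # random cutting point
        a, b = P[i][:], P[j][:]
        a[cut:], b[cut:] = b[cut:], a[cut:]  # exchange segments after the cut
        children += [a, b]
    return children
```

Note that every gene of a child still comes from one of the two parents at the same position, which is a quick sanity check on the operator.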

mutation.m

function Y = mutation(P,n)
% P = population
% n = the number of pairs of chromosomes to ...
% be mutated (equivalent to the traditional mutation rate)
[x1,y1] = size(P);
Z = zeros(2*n,y1);
for i = 1:n
    r1 = randi(x1,1,2);
    while r1(1) == r1(2)
        r1 = randi(x1,1,2); % make sure the 2 selected chromosomes differ
    end
    A1 = P(r1(1),:); % parent 1
    A2 = P(r1(2),:); % parent 2
    r2 = randi(y1);  % random gene position
    A0 = A1(r2);     % swap the selected gene between the two chromosomes
    A1(r2) = A2(r2);
    A2(r2) = A0;
    Z(2*i-1,:) = A1;
    Z(2*i,:)   = A2;
end
Y = Z;
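This mutation operator is a gene swap between two chromosomes rather than a random perturbation. A Python sketch of the same logic (again my own representation):

```python
import random

def mutation(P, n):
    """Swap mutation mirroring mutation.m: for each of n distinct
    chromosome pairs, exchange one randomly chosen gene, producing
    2*n mutants."""
    x1, y1 = len(P), len(P[0])
    mutants = []
    for _ in range(n):
        i, j = random.sample(range(x1), 2)  # two distinct chromosomes
        g = random.randrange(y1)            # gene position to swap
        a, b = P[i][:], P[j][:]
        a[g], b[g] = b[g], a[g]             # exchange the selected gene
        mutants += [a, b]
    return mutants
```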

objective_function.m

function Y = objective_function(X)
% X = candidate solution
% Styblinski-Tang function: f(x) = 0.5*sum(xi^4 - 16*xi^2 + 5*xi)
d = length(X);
total = 0; % avoid shadowing the built-in function "sum"
for z = 1:d
    xi = X(z);
    total = total + xi^4 - 16*xi^2 + 5*xi;
end
Y = total/2;
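This objective is the well-known Styblinski-Tang benchmark, whose global minimum is approximately -39.166 per dimension at every coordinate near -2.903534, which makes it easy to verify a solver's answer. A Python version:

```python
def styblinski_tang(X):
    """Styblinski-Tang function: f(x) = 0.5 * sum(xi^4 - 16*xi^2 + 5*xi).
    Global minimum ~ -39.166*d at xi ~ -2.903534 in every dimension."""
    return 0.5 * sum(x**4 - 16*x**2 + 5*x for x in X)
```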

evaluation.m

function YY = evaluation(P,ot,co)
% P  = population
% ot = optimization type: 1 = max, -1 = min
% co = coefficient for converting a min into a max problem (to make sure
%      that the fitness value is always positive)
[x1,y1] = size(P);
H = zeros(1,x1);
for i = 1:x1
    H(i) = objective_function(P(i,:));
end
% depending on the type of optimization
if ot == 1 % maximization problem
    Y = H + co; % add co to make sure all elements in Y are positive
else       % minimization problem
    K = zeros(1,x1);
    for i = 1:x1
        K(i) = 1/(co + H(i)); % convert min to max: smaller objective -> larger fitness
    end
    Y = K;
end
YY = Y;
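The key idea is the min-to-max conversion: mapping a raw objective value f to 1/(co + f) makes smaller objectives yield larger positive fitness, as roulette-style selection requires. A Python sketch of just that conversion (I pass the raw objective values directly instead of the population, a simplification of my own):

```python
def evaluation(H, ot, co):
    """Fitness conversion mirroring evaluation.m, applied to a list of
    raw objective values H. For maximization (ot = 1), shift by co so
    all fitness values are positive; for minimization, map f to
    1/(co + f) so smaller objectives get larger positive fitness."""
    if ot == 1:
        return [h + co for h in H]
    return [1.0 / (co + h) for h in H]
```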

local_search.m

function Y = local_search(X,s,lb,ub)
% X  = current best chromosome
% s  = step size
% lb = lower bound
% ub = upper bound
[x,y] = size(X);
A = ones(2*y,1)*X; % replicate X into 2*y rows, one pair per variable
j = 1;
for i = 1:y
        L1 = X(1,j) + s*rand;
% ... (remainder of local_search.m omitted in this post)
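Since local_search.m is truncated above, here is a hypothetical Python completion based only on the visible lines (the 2*y replicated rows suggest one upward and one downward perturbation per variable; the clipping to bounds is my assumption, not the author's confirmed code):

```python
import random

def local_search(X, s, lb, ub):
    """Hypothetical completion (local_search.m is truncated above):
    build 2*len(X) neighbours of the best chromosome X by perturbing
    each variable once upward and once downward by a random amount up
    to step size s, clipped to the bounds."""
    neighbours = []
    for i in range(len(X)):
        up, down = X[:], X[:]
        up[i] = min(X[i] + s * random.random(), ub[i])
        down[i] = max(X[i] - s * random.random(), lb[i])
        neighbours += [up, down]
    return neighbours
```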

selection.m

function [YY1,YY2] = selection(P,B,p,s)
% P = population
% B = fitness values
% p = population size
% s = keep the top s chromosomes
%------------------------------
% Top (elitist) selection operation
[x1,y1] = size(P);
Y1 = zeros(p,y1);
Fn = zeros(1,p);
for i = 1:s
    [r1,c1] = find(B == max(B));
    Y1(i,:) = P(max(c1),:);
% ... (remainder of selection.m omitted in this post)
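selection.m is also truncated; the visible lines show elitist selection of the top s chromosomes. A hypothetical Python completion (the roulette-wheel fill for the remaining slots is my assumption, a common pairing with elitism, not the author's confirmed code):

```python
import random

def selection(P, B, p, s):
    """Hypothetical completion (selection.m is truncated above): keep
    the top s chromosomes by fitness, then fill the remaining p - s
    slots by fitness-proportional (roulette-wheel) sampling. Fitness
    values B must be positive, which evaluation.m guarantees via co."""
    order = sorted(range(len(P)), key=lambda i: B[i], reverse=True)
    elite = order[:s]                                  # elite transfer
    rest = random.choices(range(len(P)), weights=B, k=p - s)
    chosen = elite + rest
    return [P[i][:] for i in chosen], [B[i] for i in chosen]
```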

GA.m

clear all
clc
close all
tic
%--------------------------------------------------------------------------
nv = 2;       % number of variables
lb = [-5 -5]; % lower bound
ub = [5 5];   % upper bound
ot = -1;      % minimization ot = -1; maximization ot = 1
t = 5;        % computing time (s)
%--------------------------------------------------------------------------
% Parameters of GA
p=50;     % population size
c=10;     % the number of pairs of chromosomes to be crossed (equivalent to the traditional crossover rate)
m=5;      % the number of pairs of chromosomes to be mutated (equivalent to the traditional mutation rate)
rs=20;    % adaptive restart interval (generations)
g=2;      % keep top chromosomes
r=2;      % number of chromosomes in the initial guess
ms = 0.2; % max step size for local search
co = 200; % coefficient for converting a min into a max problem (to make sure
          % that the fitness value is always positive)
%-------------------------------------------------------------------
% Stopping criterion
tg=10000000; % number of generations - set to be large so that the computing time t is the effective stopping criterion
%--------------------------------------------------------------------------
figure
xlabel('Generation')
ylabel('Objective function value')
if ot ==1
    title('Blue dots = Maximum value         Red dots = Average value');
else
    title('Blue dots = Minimum value         Red dots = Average value');
end
hold on

P1=population(r, nv, lb, ub); % Initial guess
w=1;
ww=1;
for j = 1:tg
    P=population(p-r, nv, lb, ub);
    P(p-r+1:p,:)=P1;
    for i=1:tg   
        % Extended population
        P(p+1:p+2*c,:)=crossover(P,c);
        P(p+2*c+1:p+2*c+2*m,:)=mutation(P,m);
        P(p+2*c+2*m+1:p+2*c+2*m+2*nv,:)=local_search(P(1,:),ms,lb,ub);
% ... (remainder of GA.m omitted in this post)
Sorry! This is only half of the code.
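Although the full GA.m is not shown, the parameter rs = 20 implies a stall-based restart test inside the main loop. A minimal Python sketch of one plausible restart criterion (an assumption for illustration, not the author's exact code):

```python
def should_restart(best_history, rs, tol=1e-9):
    """Hypothetical restart test: trigger an adaptive restart when the
    best objective value has not improved for the last rs generations.
    best_history holds the best (minimum) value found per generation."""
    if len(best_history) < rs + 1:
        return False
    return min(best_history[-rs:]) >= min(best_history[:-rs]) - tol
```

On a restart, the outer loop in GA.m re-seeds a fresh random population while carrying over the elite chromosomes in P1, so progress is never thrown away.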

Notice: you can watch the video and re-type the Matlab code yourself – that would take you from 1 to 3 hours; or, for just €3.99 (the cost of a cup of coffee), you can download/copy the whole Matlab code within 3 minutes. The choice is yours.

Original price is €6.99 but today it’s only €3.99 (save €3 today – available for a limited time only)

Download the whole Matlab code here (Membership Code ID: 037)

There is no need to build the Matlab code from scratch, because that is very time-consuming. My idol, Jim Rohn, once said: “Time is more valuable than money. You can get more money, but you cannot get more time.” If you think this code can be used in your research or teaching work, you should download it and then customize/modify/apply it to your work, without any obligation to cite the original source if you don’t want to. However, redistribution (i.e., downloading the code/script here and then making it available on another site on the Internet) is strictly prohibited.

If you have any question or problem, please contact Dr. Panda by email: learnwithpanda2018@gmail.com

Thank you very much and good luck with your research!
