Matlab Code of Hybrid Genetic Algorithm for Non-linear Constrained Optimization Problems

Hello everyone. I have successfully developed a powerful Hybrid Genetic Algorithm for non-linear constrained optimization problems. This Genetic Algorithm not only has a mechanism to restart its search process when it gets stuck in a local optimum, but also includes a local search algorithm. As a result, this Hybrid Genetic Algorithm is very robust for non-linear constrained global optimization problems.

In this video, I will first run my Hybrid Genetic Algorithm to demonstrate its performance, and then I will show you its Matlab code, which you can copy and customize to solve your own problems. Let’s see!

For more videos like this, check my YouTube channel here.

File 1: population.m

function Y=population(p,nv,lb,ub)
% nv = number of variables
% p = population size
% lb = Lower bound
% ub = Upper bound
Y = zeros(p,nv);                     % preallocate the population matrix
for i = 1:p
    for j = 1:nv
        Y(i,j)=(ub(j)-lb(j))*rand+lb(j); % uniform random value in [lb(j), ub(j)]
    end
end
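
As a quick usage check (an illustrative example, not one of the original files), the call below builds a 20-by-4 population within the bounds used later in GA.m:

lb = [1 1 1 1]; ub = [5 5 5 5];  % bounds taken from GA.m
P = population(20, 4, lb, ub);   % 20 random chromosomes with 4 variables each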

File 2: crossover.m

function Y=crossover(P,n)
% P = population
% n = number of chromosome pairs to be crossed over
[x1 y1]=size(P);
Z=zeros(2*n,y1);
for i = 1:n
    r1=randi(x1,1,2);
    while r1(1)==r1(2)
        r1=randi(x1,1,2);
    end
    A1=P(r1(1),:);
    A2=P(r1(2),:);
    r2=1+randi(y1-1);
.
.
.
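
The rest of crossover.m is not shown here. Purely as an illustration of the idea (a minimal stand-in sketch, not the downloadable code), a single-point crossover that swaps the parents' tails could be written as follows; the function name simple_crossover is hypothetical:

function Y=simple_crossover(P,n)       % illustrative stand-in, not the downloadable code
% Single-point crossover: each pass picks two distinct parents and swaps their tails
[x1,y1]=size(P);
Z=zeros(2*n,y1);
for i=1:n
    r1=randperm(x1,2);                 % two distinct parent indices
    A1=P(r1(1),:);
    A2=P(r1(2),:);
    r2=randi(y1-1);                    % crossover point
    Z(2*i-1,:)=[A1(1:r2) A2(r2+1:y1)]; % child 1: head of A1, tail of A2
    Z(2*i,:)=[A2(1:r2) A1(r2+1:y1)];   % child 2: head of A2, tail of A1
end
Y=Z;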

File 3: mutation.m

function Y=mutation(P,n)
% P = population
% n = number of chromosome pairs to be mutated
[x1 y1]=size(P);
Z=zeros(2*n,y1);
for i = 1:n
    r1=randi(x1,1,2);
    while r1(1)==r1(2)
        r1=randi(x1,1,2);
    end
    A1=P(r1(1),:);
    A2=P(r1(2),:);
    r2=1+randi(y1-1);
.
.
.
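
The rest of mutation.m is also truncated. As a hypothetical sketch only (the name simple_mutation and the swap strategy are my assumptions, not the downloadable code), a mutation operator with the same interface could exchange one gene between two chromosomes:

function Y=simple_mutation(P,n)        % illustrative stand-in, not the downloadable code
% Swap mutation: each pass picks two distinct chromosomes and exchanges one gene
[x1,y1]=size(P);
Z=zeros(2*n,y1);
for i=1:n
    r1=randperm(x1,2);                 % two distinct chromosome indices
    A1=P(r1(1),:);
    A2=P(r1(2),:);
    r2=randi(y1);                      % gene position to exchange
    t=A1(r2); A1(r2)=A2(r2); A2(r2)=t; % swap the selected gene
    Z(2*i-1,:)=A1;
    Z(2*i,:)=A2;
end
Y=Z;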

File 4: evaluation.m

function Y=evaluation(P,ot)
% P  = population
% ot = 1 for maximization, -1 for minimization (see GA.m)
[x1 y1]=size(P);
H=zeros(x1,1);
for i = 1:x1
   H(i,1)= ObjectiveFunction(P(i,:));  % penalized objective value of chromosome i
end
if ot == 1
    Y = H;        % maximization: fitness is the objective itself
else
    Y = 10^6-H;   % minimization: convert to a fitness to be maximized
end
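
For example (an illustrative snippet, not part of the original files), the fitness of a small random population in the minimization setting (ot = -1) can be computed like this:

P = population(10, 4, [1 1 1 1], [5 5 5 5]);
F = evaluation(P, -1);   % higher F means a lower penalized objective
[~, idx] = max(F);       % index of the current best chromosome
best = P(idx,:);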

File 5: selection.m

function [YY1 YY2] = selection(P1,B,p,s)
% P1 = population
% B  = fitness values
% p  = population size
% s  = number of top chromosomes to select
%-------------------------------------------------------------------------
% Top selection operation
B=B';
for i =1:s
    [r1 c1]=find(B==max(B));
    Y1(i,:)=P1(max(c1),:);
    Fn(i)=B(max(c1));
.
.
.

File 6: constraints_and_repair.m

function Y = constraints_and_repair(X,nv,lb,ub)
% Customize here to solve your problem. The rest will be handled automatically
if sum(X.^2) <= 40-0.00001 || sum(X.^2) >= 40+0.00001 % outside the tolerance band of the equality constraint sum(X.^2) = 40
    A1=X(1:nv-1);
    A2=40-sum(A1.^2);            % squared value the last variable must take
    if A2 < 0                    % repair is impossible, so resample the chromosome
        for j = 1:nv
            X(j)=(ub(j)-lb(j))*rand+lb(j); % generate a new one
        end
    else
        X(nv)=sqrt(A2);          % set the last variable so the equality holds
    end
end

if X(1)*X(2)*X(3)*X(4) < 25      % inequality constraint x1*x2*x3*x4 >= 25
    X(4)=25/(X(1)*X(2)*X(3));    % raise x4 so the inequality is satisfied
end
Y=X;
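
A hypothetical usage example (mine, not from the original post): repair a random candidate and inspect the two constraint values afterwards.

nv = 4; lb = [1 1 1 1]; ub = [5 5 5 5];
X = (ub-lb).*rand(1,nv)+lb;                % random candidate
X = constraints_and_repair(X, nv, lb, ub); % repaired candidate
[sum(X.^2), X(1)*X(2)*X(3)*X(4)]           % check sum(X.^2) = 40 and x1*x2*x3*x4 >= 25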

File 7: local_search.m

function Y=local_search(X,s,lb,ub)
% X = current best solution
% s = max step size (ms in GA.m)
[x y]=size(X);
A=ones(2*y,1)*X;
j=1;
for i=1:y
        L1=X(1,j)+s*rand;
        if L1 > ub(i)
            L1 = ub(i);
.
.
.
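
local_search.m is also truncated. From the visible lines it appears to build 2*y neighbours of X by stepping each variable up and down by at most s and clipping at the bounds; a self-contained sketch of that idea (the name simple_local_search is hypothetical, and this is not the downloadable code) is:

function Y=simple_local_search(X,s,lb,ub)   % illustrative stand-in, not the downloadable code
% Build 2*y neighbours of the current best X by nudging each variable up and down by at most s
y=numel(X);
A=ones(2*y,1)*X;                        % every neighbour starts from X (a row vector)
for i=1:y
    A(2*i-1,i)=min(X(i)+s*rand, ub(i)); % small step up, clipped at the upper bound
    A(2*i,i)=max(X(i)-s*rand, lb(i));   % small step down, clipped at the lower bound
end
Y=A;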

File 8: nonlinear_constraints.m

function [c ceq]=nonlinear_constraints(X)
% Customize here to solve your problem. The rest will be handled automatically
c = -X(1)*X(2)*X(3)*X(4)+25; % <= 0 constraint
ceq = sum(X.^2)-40;          %  = 0 constraint
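
For example (an illustrative check, not part of the original files), a point is feasible when c <= 0 and ceq is close to zero:

[c, ceq] = nonlinear_constraints([1 5 5 1]);
% c   = -(1*5*5*1)+25 = 0   -> the inequality constraint is exactly active
% ceq = 1+25+25+1-40  = 12  -> the equality constraint is violated by 12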

File 9: ObjectiveFunction.m

function Y = ObjectiveFunction(X)
% Customize here to solve your problem. The rest will be handled automatically
lb = [1 1 1 1];               % lower bound
ub = [5 5 5 5];               % upper bound
O = X(1)*X(4)*(X(1)+X(2)+X(3))+X(3); % objective function here
%.............................................
%.............................................
P = penalty_function(X,lb,ub);
Y = O+100*P;
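
As a quick illustration (mine, not from the original post), evaluating a trial point returns the raw objective plus 100 times the penalty computed by penalty_function:

x_try = [1 5 5 1];                % inside the bounds but violating sum(x.^2) = 40
y_try = ObjectiveFunction(x_try); % 1*1*(1+5+5)+5 = 16, plus 100 times the penalty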

File 10: penalty_function.m

function Y = penalty_function(X,lb,ub)
[c ceq]=nonlinear_constraints(X);
c = max(c,0);            % only a positive c counts as an inequality violation
V = c + abs(ceq);        % total nonlinear constraint violation
% check the bounds
[x1 y1]=size(X);
H=zeros(2,y1);
for i = 1:y1
    if X(i)< lb(i)
        H(1,i)=abs(X(i)-lb(i));
.
.
.
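
penalty_function.m is truncated as well. A compact stand-in that captures the same idea (constraint violation plus bound violation; the name simple_penalty is hypothetical and this is not the downloadable code) could be:

function Y=simple_penalty(X,lb,ub)   % illustrative stand-in, not the downloadable code
% Total violation = inequality violation + equality violation + bound violations
[c, ceq] = nonlinear_constraints(X);
V = sum(max(c,0)) + sum(abs(ceq));         % nonlinear constraint violations
B = sum(max(lb-X,0)) + sum(max(X-ub,0));   % how far X lies outside its bounds
Y = V + B;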

File 11: fmincon_solver.m

function [x,fval,exitflag,output,lambda,grad,hessian] = fmincon_solver(x0,lb,ub,MaxIterations_Data)
%% This is an auto generated MATLAB file from Optimization Tool.

%% Start with the default options
options = optimoptions('fmincon');
%% Modify options setting
options = optimoptions(options,'Display', 'off');
options = optimoptions(options,'MaxIterations', MaxIterations_Data);
[x,fval,exitflag,output,lambda,grad,hessian] = ...
fmincon(@ObjectiveFunction,x0,[],[],[],[],lb,ub,@nonlinear_constraints,options);
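
A hypothetical call (the values are borrowed from GA.m; the Optimization Toolbox is required) looks like this:

x0 = [1 5 5 1];                  % any starting point inside the bounds
lb = [1 1 1 1]; ub = [5 5 5 5];
[x, fval] = fmincon_solver(x0, lb, ub, 100); % refine x0 with at most 100 iterations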

File 12: GA.m

clear all
clc
close all
tic
%--------------------------------------------------------------------------
% Customize here to solve your problem. The rest will be handled automatically
nv = 4;                    % number of variables
lb = [1 1 1 1];            % lower bound
ub = [5 5 5 5];            % upper bound
ot = -1;                   % minimization: ot = -1; maximization: ot = 1
t = 20;                    % computing time (s)
%--------------------------------------------------------------------------
% Tune these parameters to maximize the performance of the GA (optional)
p = 20;     % population size
c = 10;     % crossover rate
m = 5;      % mutation rate
s = 5;      % adaptive restart of the search process
g = 3;      % keep the top g chromosomes
r = 3;      % number of chromosomes in the initial guess
ms = 0.001; % max step size for the local search
%-------------------------------------------------------------------
% Stopping criteria
tg=10000000; % number of generations - set to a large value so that the computing time t is the effective stopping criterion
%--------------------------------------------------------------------------
P1=population(r, nv, lb, ub); % Initial guess
w=1;

for j = 1:tg
    P=population(p-r, nv, lb, ub);
    P(p-r+1:p,:)=P1;
    for i=1:tg   
        % Extended population
        P(p+1:p+2*c,:)=crossover(P,c);
        P(p+2*c+1:p+2*c+2*m,:)=mutation(P,m);
.
.
.
Sorry! This is only half of the code.

Notice: It’s possible to watch the video and re-type the Matlab code yourself – that would take you from 1 to 3 hours; or with just €2.99 (the cost of a cup of coffee), you can download/copy the whole Matlab code within 2 minutes. It’s your choice to make.

Original price is €4.99 but today it’s only €2.99 (save €2 today – available for a limited time only)

Download the whole Matlab code here (Membership Code ID: 003)

No need to build the Matlab code from scratch, because that is very time-consuming. My idol, Jim Rohn, once said: “Time is more valuable than money. You can get more money, but you cannot get more time.” If you think this code can be used in your research/teaching work, you should download it and then customize/modify/apply it to your work, without any obligation to cite the original source if you don’t want to. However, redistribution (i.e., downloading the code/script here and then making it available on another site on the Internet) is strictly prohibited.

If you have any question or problem, please contact Dr. Panda by email: learnwithpanda2018@gmail.com

Thank you very much and good luck with your research!

20 Replies to “Matlab Code of Hybrid Genetic Algorithm for Non-linear Constrained Optimization Problems”

  1. Hi, I am trying to solve a multi-objective convex minimization problem using a Genetic Algorithm, but I am unable to do that. I am new to this algorithm; any help would be highly appreciated.

    1. There are 2 videos related to multi-objective optimization on this blog. Have a look at them and see how you can apply them to solve your problems.

      1. Thanks for your reply, but both of those videos involve a single objective: they show how to either minimize two functions together or maximize them. In my model, I have to minimize one function and maximize the other at the same time. Please help me in this regard.

        1. So, in your case, we need to convert the maximization function into a minimization function (for example, by negating it). The procedure to solve it is then still the same.

  2. Hi sir

    I really appreciate this website. Your videos are great and very helpful. I would like to take this opportunity to raise a question about linking a neural network with the GA for optimization. Once I am done with my modelling in the neural network toolbox, how can I extract the model from the Matlab neural network tool and use it as the objective function to be optimized with the GA in Matlab? Thank you so much in advance for answering my concern.

  3. Thanks for your reply but both of those videos involve single objective that is either it shows how to minimize two functions together or maximize. In my model, i have to minimize one function whereas maximise the other at once. Please help me in this regard.

  4. When I run the code, it shows an error, “Index exceeds the number of array elements (56)”, at line 29 of selection.m. Could you explain why there is this bug? Thank you!

  5. Hello, sir! Thanks for the code. I have paid for and downloaded all the code; however, it doesn’t run successfully. Could you please tell me if you have updated it? Thank you.

    1. Hello and thanks for your interest! No, these codes are exactly the same as in the video. Maybe the reason is the Matlab version. I used Matlab version R2016a. Please try this Matlab version to see if the code runs successfully as shown in the video.

  6. Thanks for your help, Dr Panda. BTW, do you have GA codes for optimization problems with both integer and 0-1 variables? Thank you.

  7. Assignment cannot be performed because the index on the left is incompatible with the size on the right.
    KK(w,1)=sum(10^6-S2)/p;

  8. Hi, Dr panda.
    I have paid the fee, but I cannot download the code.
    Could you please send it directly to my email?
