Hello everyone and welcome.
In this video, I’m going to show you how to solve nonlinear constrained optimization problems with Matlab.
This solution method is based on Matlab's fmincon solver. To make the solver more robust, I use a simple piece of code that runs fmincon multiple times (say, 10 times), each time from a random initial solution, and keeps the best solution found across all runs. This significantly improves the chance of finding the global optimum.
Here is the detail of the benchmark function: minimize f(x, y) = sin(y)*exp((1 - cos(x))^2) + cos(x)*exp((1 - sin(y))^2) + (x - y)^2, subject to the nonlinear constraint (x + 5)^2 + (y + 5)^2 <= 25, with bounds -10 <= x <= 0 and -6.5 <= y <= 0.
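Before solving it, it can help to look at the landscape first. The snippet below is only a quick visualization sketch (not part of the solver): it plots the objective over the bounds together with the boundary of the feasible circle, and shows that the surface is multimodal, which is exactly why multiple random starts are useful.
[X, Y] = meshgrid(linspace(-10, 0, 200), linspace(-6.5, 0, 200));
Z = sin(Y).*exp((1 - cos(X)).^2) + cos(X).*exp((1 - sin(Y)).^2) + (X - Y).^2;  % same formula as objective_function.m below
contourf(X, Y, Z, 30); colorbar
hold on
theta = linspace(0, 2*pi, 200);
plot(-5 + 5*cos(theta), -5 + 5*sin(theta), 'r', 'LineWidth', 1.5)  % constraint boundary (x+5)^2 + (y+5)^2 = 25
hold off
xlabel('x'); ylabel('y'); title('Benchmark objective and constraint boundary')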
Let’s see how it works.
For more videos like this, check out my YouTube channel here.
Matlab code
objective_function.m
function Output = objective_function(Input)
x = Input(1);
y = Input(2);
Output = sin(y)*exp((1-cos(x))^2) + cos(x)*exp((1-sin(y))^2) + (x-y)^2;
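As a quick sanity check, the objective can be evaluated directly at any test point; the point below is just an arbitrary example inside the bounds.
objective_function([-5 -3])   % example call at a hypothetical test point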
nonlinear_constraint.m
function [C, Ceq] = nonlinear_constraint(Input)
x = Input(1);
y = Input(2);
C = (x+5)^2 + (y+5)^2 - 25; % inequality constraint
Ceq = []; % no equality constraint
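fmincon treats the inequality as satisfied when C <= 0, so this function can also be used to check feasibility of a point by hand; for example, at the hypothetical point (-5, -3):
[C, Ceq] = nonlinear_constraint([-5 -3])   % C = -21 <= 0, so (-5, -3) is feasible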
fminconsolver.m
function [x,fval,exitflag,output,lambda,grad,hessian] = fminconsolver(x0,lb,ub)
options = optimoptions('fmincon');
options = optimoptions(options,'Display', 'off');
options = optimoptions(options,'PlotFcn', { @optimplotfval @optimplotconstrviolation });
[x,fval,exitflag,output,lambda,grad,hessian] = ...
fmincon(@objective_function,x0,[],[],[],[],lb,ub,@nonlinear_constraint,options);
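Before running the multi-start driver below, a single run of this wrapper can be tested on its own; the start point here is just an example inside the bounds used later.
x0 = [-5 -3];        % example start point (hypothetical)
lb = [-10 -6.5];
ub = [0 0];
[x, fval, exitflag] = fminconsolver(x0, lb, ub)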
main_code.m
clc
clear all
close all
%--------------------------------------------------------------------------
% customization: lower bound lb = [1xn], and upper bound ub = [1xn]
lb = [-10 -6.5];
ub = [0 0];
%--------------------------------------------------------------------------
n = 10;              % number of random starts
nvar = numel(lb);    % number of decision variables
S = zeros(n, nvar);  % best solution found in each run
R = zeros(n, 1);     % objective value found in each run
% multi-start optimization solver
for j = 1:n
    x0 = lb + (ub - lb).*rand(1, nvar);  % random initial solution within the bounds
    [x, fval, exitflag, output, lambda, grad, hessian] = fminconsolver(x0, lb, ub);
    S(j, :) = x;
    R(j, 1) = fval;
end
[~, best] = min(R);  % index of the run with the lowest objective value
objective_function_value = R(best, 1)
optimal_solution = S(best, :)
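As a side note (not used in this video): if the Global Optimization Toolbox is available, the same multi-start idea can also be expressed with Matlab's built-in MultiStart solver. A minimal sketch, assuming the toolbox is installed:
problem = createOptimProblem('fmincon', ...
    'objective', @objective_function, ...
    'x0', [-5 -3], ...                     % any start point inside the bounds
    'lb', [-10 -6.5], 'ub', [0 0], ...
    'nonlcon', @nonlinear_constraint);
ms = MultiStart('Display', 'off');
[xbest, fbest] = run(ms, problem, 10)      % 10 random start points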
P/s: If you find the post useful, please share it, both to bookmark it for yourself and to help other people as well.
Comments
How do you solve an optimization problem where one of the constraints is a logarithmic inequality?
The solving principle is the same as in this video; you just need to update the constraint function.
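For example, a hypothetical logarithmic inequality such as log(x + y) <= 2 could be written in fmincon's C(x) <= 0 form as shown below (assuming x + y stays positive over the search region, e.g. through the bounds):
function [C, Ceq] = nonlinear_constraint(Input)
x = Input(1);
y = Input(2);
C = log(x + y) - 2;   % hypothetical logarithmic inequality: log(x+y) <= 2
Ceq = [];             % no equality constraint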