Fletcher-Reeves conjugate gradient method
Hello,
My program gives the correct solution for the problem, but I believe it is taking unnecessary steps. For a problem with the initial point at [4 6], my code using the conjugate gradient method takes more steps than when I solve the same problem with the steepest descent method.
-> Main function:
function [x_opt, f_opt, k] = conjugate_gradient(fob, g_fob, x0, tol_grad)
c0 = feval(g_fob, x0);   % gradient at the initial point
k  = 0;                  % iteration counter
if norm(c0) < tol_grad
    x_opt = x0;                  % optimum point
    f_opt = feval(fob, x_opt);   % cost function value
else
    d0 = -c0;                    % initial search direction (steepest descent)
    alfa0 = equal_interval_line_search(x0, d0, fob, 0.5, 1e-6);   % line search (step size)
    x1 = x0 + alfa0*d0;
    c1 = feval(g_fob, x1);
    while norm(c1) > tol_grad
        beta = (norm(c1)/norm(c0))^2;     % Fletcher-Reeves coefficient
        d1 = -c1 + beta*d0;               % conjugate search direction
        alfa1 = equal_interval_line_search(x1, d1, fob, 0.5, 1e-6);
        x2 = x1 + alfa1*d1;
        c0 = c1;                          % shift gradients, directions and iterate
        c1 = feval(g_fob, x2);
        d0 = d1;
        x1 = x2;
        k  = k + 1;                       % note: the first step, taken before the loop, is not counted here
    end
    x_opt = x1;
    f_opt = feval(fob, x_opt);
end
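For reference, the beta update inside the loop is the Fletcher-Reeves coefficient, written here in the notation of the code (c0 and c1 are the previous and current gradients):
beta = norm(c1)^2 / norm(c0)^2    % ||grad f(x_k+1)||^2 / ||grad f(x_k)||^2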
-> Cost function:
function f = fob_8_58(x)
f = 8*x(1)^2 + 8*x(2)^2 ...
    - 80*sqrt(x(1)^2 + x(2)^2 - 20*x(2) + 100) ...
    - 80*sqrt(x(1)^2 + x(2)^2 + 20*x(2) + 100) ...
    - 5*x(1) - 5*x(2);
-> Gradient function:
function g = grad_fob_8_58(x)
g(1) = 16*x(1) - 80*x(1)/sqrt(x(1)^2 + x(2)^2 - 20*x(2) + 100) ...
               - 80*x(1)/sqrt(x(1)^2 + x(2)^2 + 20*x(2) + 100) - 5;
g(2) = 16*x(2) - 80*(x(2)-10)/sqrt(x(1)^2 + x(2)^2 - 20*x(2) + 100) ...
               - 80*(x(2)+10)/sqrt(x(1)^2 + x(2)^2 + 20*x(2) + 100) - 5;
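For reference, a quick way to run the routine on the problem from the question, assuming the functions above and equal_interval_line_search are on the path (the tolerance value here is just an example):
x0  = [4 6];     % initial point from the question
tol = 1e-4;      % example gradient-norm tolerance (assumed value)
[x_opt, f_opt, k] = conjugate_gradient(@fob_8_58, @grad_fob_8_58, x0, tol);
fprintf('x_opt = [%g %g], f_opt = %g, iterations = %d\n', x_opt(1), x_opt(2), f_opt, k)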
Answers (1)
Matt J
on 17 Aug 2017
One reason might be that you are not doing any restarts in your conjugate gradient implementation. On non-quadratic problems, the sequence of search directions d typically loses conjugacy as the iterations progress, so you need to restart with d = -gradient from time to time.
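For illustration only, here is one way the inner loop from the question could be modified to restart periodically. Restarting every n = number of variables iterations is a common choice, but the schedule itself is an assumption here, not part of the answer above; variable names follow the posted code.
n = numel(x0);                          % assumed restart interval: the problem dimension
while norm(c1) > tol_grad
    if mod(k+1, n) == 0                 % assumed schedule: restart every n iterations
        d1 = -c1;                       % restart with the steepest-descent direction
    else
        beta = (norm(c1)/norm(c0))^2;   % Fletcher-Reeves coefficient
        d1 = -c1 + beta*d0;             % conjugate search direction
    end
    alfa1 = equal_interval_line_search(x1, d1, fob, 0.5, 1e-6);
    x2 = x1 + alfa1*d1;
    c0 = c1;  c1 = feval(g_fob, x2);    % shift gradients, directions and iterate
    d0 = d1;  x1 = x2;
    k  = k + 1;
end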