This code takes a very long time to run. Why is that? Did I write the function wrong?
risky amalia
on 21 Mar 2020
Commented: Rena Berman on 14 May 2020
clc
clear
format long
% Function Definition (Enter your Function here):
syms X Y;
f = 100*((Y-X^2)^2+(1-X))^2;
% Initial Guess:
x(1) = -0.75;
y(1) = 1;
e = 10^(-8); % Convergence Criteria
i = 1; % Iteration Counter
% Gradient Computation:
df_dx = diff(f, X);
df_dy = diff(f, Y);
J = [subs(df_dx,[X,Y], [x(1),y(1)]) subs(df_dy, [X,Y], [x(1),y(1)])]; % Gradient
S = -(J); % Search Direction
% Minimization Condition:
while norm(J) > e
I = [x(i),y(i)]';
syms h; % Step size
g = subs(f, [X,Y], [x(i)+S(1)*h,y(i)+h*S(2)]);
dg_dh = diff(g,h);
h = solve(dg_dh, h); % Optimal Step Length
x(i+1) = I(1)+h(1)*S(1); % Updated x value
y(i+1) = I(2)+h(2)*S(2); % Updated y value
i = i+1;
J = [subs(df_dx,[X,Y], [x(i),y(i)]) subs(df_dy, [X,Y], [x(i),y(i)])]; % Updated Gradient
S = -(J); % New Search Direction
end
% Result Table:
Iter = 1:i;
X_coordinate = x';
Y_coordinate = y';
Iterations = Iter';
T = table(Iterations,X_coordinate,Y_coordinate);
% Plots:
contour3(f, 'Fill', 'On');
hold on;
plot(x,y,'*-r');
% Output:
print('Initial Objective Function Value: %d\n\n',subs(f,[X,Y], [x(1),y(1)]));
if (norm(J) < e)
print('Minimum succesfully obtained...\n\n');
end
print('Number of Iterations for Convergence: %d\n\n', i);
print('Point of Minima: [%d,%d]\n\n', x(i), y(i));
print('Objective Function Minimum Value Post-Optimization: %d\n\n', subs(f,[X,Y], [x(i),y(i)]));
disp(T);
1 Comment
Accepted Answer
Walter Roberson
on 21 Mar 2020
Edited: Walter Roberson on 21 Mar 2020
f = 100*((Y-X^2)^2+(1-X))^2;
That is degree 8 in X.
g = subs(f, [X,Y], [x(i)+S(1)*h,y(i)+h*S(2)]);
That substitutes h linearly into X and Y, so the degree in h will match the larger of the degrees in X and Y: g ends up degree 8 in h.
dg_dh = diff(g,h);
Differentiating degree 8 gives degree 7, which happens to factor into polynomials of degree 4 and degree 3.
h = solve(dg_dh, h); % Optimal Step Length
Degree 7, so you get 7 solutions, some of which are complex-valued.
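To see this concretely, here is a sketch of the first iteration's line search in Python/SymPy (standing in for MATLAB's Symbolic Math Toolbox), using the poster's starting point x = -0.75, y = 1. It confirms the degree counts: g is degree 8 in h, its derivative is degree 7, and the polynomial has 7 roots.

```python
# Sketch in SymPy of the first steepest-descent line search from the
# question's code, to confirm the degree bookkeeping described above.
import sympy as sp

X, Y, h = sp.symbols('X Y h')
f = 100*((Y - X**2)**2 + (1 - X))**2

x0, y0 = sp.Rational(-3, 4), sp.Integer(1)   # the poster's initial guess
J = sp.Matrix([sp.diff(f, X), sp.diff(f, Y)]).subs({X: x0, Y: y0})
S = -J                                       # steepest-descent direction

# Substitute the line x0 + S1*h, y0 + S2*h into f and differentiate in h
g = sp.expand(f.subs({X: x0 + S[0]*h, Y: y0 + S[1]*h}))
dg = sp.diff(g, h)
roots = sp.Poly(dg, h).nroots()

print(sp.degree(g, h))    # 8
print(sp.degree(dg, h))   # 7
print(len(roots))         # 7 candidate step lengths, possibly complex
```

Which of those 7 roots `solve` lists first is exactly the branch-selection problem described next.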
x(i+1) = I(1)+h(1)*S(1); % Updated x value
y(i+1) = I(2)+h(2)*S(2); % Updated y value
The solutions are ordered, but according to internal logic that is not documented anywhere; they are certainly not sorted by anything to do with x and y. Which branch lands in which position will change as the values change, and at some point the internal ordering will happen to put complex-valued branches at h(1) and h(2).
You need to think explicitly about how you are going to deal with those multiple roots.
I would suggest using a different symbolic step variable for each of the x and y directions, computing the gradient, and solving for the two step variables. If you do this, you will get about 8 candidate values for each variable. You can then apply an algorithm to choose particular ones, such as eliminating the complex-valued solutions and taking the pair with the greatest sum of squares.
6 Comments
Walter Roberson
on 21 Mar 2020
clc
clear
format long
% Function Definition (Enter your Function here):
syms X Y;
f = 100*((Y-X^2)^2+(1-X))^2;
% Initial Guess:
x(1) = -0.75;
y(1) = 1;
e = 10^(-8); % Convergence Criteria
i = 1; % Iteration Counter
% Gradient Computation:
df_dx = diff(f, X);
df_dy = diff(f, Y);
J = [subs(df_dx,[X,Y], [x(1),y(1)]) subs(df_dy, [X,Y], [x(1),y(1)])]; % Gradient
S = -(J); % Search Direction
% Minimization Condition:
syms h k; % Step size
while norm(J) > e
I = [x(i),y(i)]';
g = subs(f, [X,Y], [x(i)+S(1)*h,y(i)+k*S(2)]);
dg_dh = diff(g,h);
dg_dk = diff(g,k);
sol = solve([dg_dh, dg_dk], [h, k]) ;
%expect about 8 solutions, some of which will be complex valued
H = double(sol.h);
K = double(sol.k);
%get rid of the complex-valued ones
mask = imag(H) == 0 & imag(K) == 0;
H = H(mask);
K = K(mask);
%now you still have to pick one, but which? Arbitrarily take the pair
%with the greatest sum-of-squares
[~, maxidx] = max(H.^2 + K.^2);
H = H(maxidx);
K = K(maxidx);
x(i+1) = I(1)+H*S(1); % Updated x value
y(i+1) = I(2)+K*S(2); % Updated y value
i = i+1;
J = [subs(df_dx,[X,Y], [x(i),y(i)]) subs(df_dy, [X,Y], [x(i),y(i)])]; % Updated Gradient
S = -(J); % New Search Direction
end
% Result Table:
Iter = 1:i;
X_coordinate = x';
Y_coordinate = y';
Iterations = Iter';
T = table(Iterations,X_coordinate,Y_coordinate);
% Plots:
M = 1;
xmin = min([-M; X_coordinate]);
xmax = max([M; X_coordinate]);
ymin = min([-M; Y_coordinate]);
ymax = max([M; Y_coordinate]);
xvec = linspace(xmin,xmax,50);
yvec = linspace(ymin,ymax,51);
fval = double(subs(subs(f, X, xvec), Y, yvec.'));
contour3(xvec, yvec, fval, 'Fill', 'On');
hold on;
plot(x,y,'*-r');
% Output:
fprintf('Initial Objective Function Value: %g\n\n', double(subs(f,[X,Y], [x(1),y(1)])));
if (norm(J) < e)
fprintf('Minimum successfully obtained...\n\n');
end
fprintf('Number of Iterations for Convergence: %d\n\n', i);
fprintf('Point of Minima: [%g,%g]\n\n', x(i), y(i));
fprintf('Objective Function Minimum Value Post-Optimization: %g\n\n', double(subs(f,[X,Y], [x(i),y(i)])));
disp(T);
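The filtering-and-selection step inside the loop is the crux of the fix. As a standalone sketch (NumPy here, with made-up candidate roots standing in for what `solve` returns), it keeps only the pairs where both step lengths are real, then arbitrarily takes the pair with the greatest sum of squares:

```python
import numpy as np

# Hypothetical candidate step lengths, as solve() might return them:
H = np.array([0.5 + 0j, 1.2 + 0.3j, -0.8 + 0j, 2.0 - 1.0j])
K = np.array([0.1 + 0j, 0.7 - 0.2j, -0.4 + 0j, 0.9 + 0j])

# Keep only pairs where both step lengths are real
mask = (H.imag == 0) & (K.imag == 0)
H, K = H[mask].real, K[mask].real

# Arbitrarily take the pair with the greatest sum of squares
best = np.argmax(H**2 + K**2)
print(H[best], K[best])  # -0.8 -0.4
```

Any deterministic selection rule would do; the point is that the choice among the roots must be made explicitly rather than left to `solve`'s undocumented ordering.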