Error in plotting Cost Function as a function of iterations
16 views (last 30 days)
Koustubh Phalak
on 20 Aug 2018
Commented: Koustubh Phalak on 21 Aug 2018
Hello all. I've been trying to implement linear regression with 2 features using gradient descent. The gradient descent works well numerically, leading to optimal values of the weight matrix and a cost function that decreases continuously with the number of iterations. But when I try to plot J vs. iterations, I don't get the desired graph. Instead, all values up to the second-to-last one are zero, and only the last value is the actual minimum of the cost function. How do I fix this? Here's my code:
For Gradient Descent:
function [W, J_list] = gradDescent(W, X, y, alpha, num_iters)
m = length(y);
n = length(W);
J_list = zeros(num_iters,1);
A = J_list;
T0 = 0; T1 = 0; T2 = 0;
for i = 1:num_iters
    T0 = W(1) - alpha*(((X * W) - y)' * X(:, 1));
    T1 = W(2) - alpha*(((X * W) - y)' * X(:, 2));
    T2 = W(3) - alpha*(((X * W) - y)' * X(:, 3));
    W(1) = T0;
    W(2) = T1;
    W(3) = T2;
    J_list(num_iters) = costFunction(X, y, W);
    fprintf('[%.0f %.0f %.0f %.0f] \n', [W(1) W(2) W(3) J_list(num_iters)]);
end
plot(1:numel(J_list), J_list, '-b', 'LineWidth', 2);
end
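(Side note: the three per-parameter updates inside the loop implement a simultaneous batch gradient-descent step. For reference, they are equivalent to the single vectorized statement below; this is just a sketch of the same math, not a change required to fix the plot.)
% Vectorized form of the update above (no 1/m factor, matching the original code):
W = W - alpha * (X' * (X * W - y));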
For the Cost Function:
function [J] = costFunction(X,y, W)
m = length(y); %The number of training examples
n = length(W);
J = 0;
J = J + 0.5*sum((X*W-y).*(X*W-y));
end
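(For reference, this computes the unaveraged squared-error cost, in equation form
J(W) = \frac{1}{2} \sum_{i=1}^{m} \bigl( x^{(i)\top} W - y^{(i)} \bigr)^2 ,
i.e. without the usual 1/m scaling, which is consistent with the unscaled gradient used in gradDescent.)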
Here is Feature Normalization:
function [X_norm, mu, sigma] = featureNormalize(X)
X_norm = X;
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));
mu = mean(X,1);
sigma = std(X,1);
for i = 1:size(X,1)
    X_norm(i,:) = (X_norm(i,:) - mu)./sigma;
end
end
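(The row-by-row loop works, but the normalization can also be written in one line. The sketch below assumes MATLAB R2016b or newer for implicit expansion; on older releases bsxfun would be needed instead.)
% One-line equivalent of the loop above (requires implicit expansion, R2016b+):
X_norm = (X - mu) ./ sigma;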
And here is my main code:
clc; clear all; close all;
data = load('data.txt');
X = data(:,1:2);
y = data(:,3);
m = length(y);
[X mu sigma] = featureNormalize(X);
X = [ones(m,1), X];
W = rand(size(X,2),1)*10;
J = costFunction(X,y,W);
iterations = 500;
alpha = 0.00001;
[W,J_list] = gradDescent(W,X,y,alpha,iterations);
All help is appreciated. TIA.
Accepted Answer
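The all-zero values come from the indexing used when the cost is stored: inside the loop, J_list(num_iters) = costFunction(X, y, W); writes every iteration's cost into the last element of J_list, so all other elements keep the zeros they were initialized with, and the plot shows a flat line followed by a single final value. Indexing with the loop variable i instead gives the expected decreasing curve. A sketch of the corrected loop body:
for i = 1:num_iters
    T0 = W(1) - alpha*(((X * W) - y)' * X(:, 1));
    T1 = W(2) - alpha*(((X * W) - y)' * X(:, 2));
    T2 = W(3) - alpha*(((X * W) - y)' * X(:, 3));
    W(1) = T0;
    W(2) = T1;
    W(3) = T2;
    J_list(i) = costFunction(X, y, W);   % store the cost of the current iteration
    fprintf('[%.0f %.0f %.0f %.0f] \n', [W(1) W(2) W(3) J_list(i)]);
end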
More Answers (0)