Linear Regression exercise (Coursera course: ex1_multi)

2 views (last 30 days)
PaulineJV on 9 Jul 2018
Hi there,
I am taking Andrew Ng's Coursera class on machine learning. After implementing gradient descent in the first exercise (the goal is to predict the price of a 1650 sq-ft, 3 br house), J_history contains the same value (2.0433e+09) at every iteration, so the convergence plot is just a horizontal line at that single value of J.
  • Here is my code:
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
    m = length(y);                     % number of training examples
    J_history = zeros(num_iters, 1);
    for iter = 1:num_iters
        delta = zeros(size(X,2), 1);   % one update term per parameter
        for i = 1:size(X,2)
            % partial derivative of the cost w.r.t. theta(i), scaled by alpha
            delta(i,1) = sum((theta'*X' - y') .* X(:,i)') * alpha / m;
        end
        theta = theta - delta;         % simultaneous update of all parameters
        J_history(iter) = computeCostMulti(X, y, theta);
    end
end
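computeCostMulti is called inside the loop, but its code is only in the attachment, not in the post body. For reference, here is a minimal sketch of the standard vectorized cost, J(theta) = 1/(2m) * sum((X*theta - y).^2), assuming the usual course formulation:

function J = computeCostMulti(X, y, theta)
    % Vectorized cost for multivariate linear regression
    m = length(y);
    errors = X * theta - y;             % m-by-1 vector of residuals
    J = (errors' * errors) / (2 * m);   % 1/(2m) * sum of squared residuals
end

Since J_history comes out flat, comparing the attached computeCostMulti against a sketch like this (in particular, checking that it actually uses the theta argument) would be a quick first check.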
  • What I entered in the Command Window:
data = load('ex1data2.txt');
X = data(:, 1:2);                       % square footage and number of bedrooms
y = data(:, 3);                         % price
m = length(y);
[X, mu, sigma] = featureNormalize(X)    % no semicolon: display the returned mu and sigma
X = [ones(m, 1) X];                     % add the intercept column
std(X(:,2))                             % sanity check: should be 1 after normalization
alpha = 0.01;
num_iters = 400;
theta = zeros(3, 1);
computeCostMulti(X, y, theta)           % initial cost with theta = zeros
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);
figure;
plot(1:numel(J_history), J_history, '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');
fprintf('Theta computed from gradient descent: \n');
fprintf(' %f \n', theta);
fprintf('\n');
I get the following result for theta: 340412.659574 110631.050279 -6649.474271
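featureNormalize is likewise only in the attachment. A minimal sketch of the usual mean/standard-deviation scaling, assuming it returns the column means (mu) and standard deviations (sigma) that are reused in the prediction step below:

function [X_norm, mu, sigma] = featureNormalize(X)
    % Scale each feature to zero mean and unit standard deviation
    mu = mean(X);
    sigma = std(X);
    X_norm = (X - mu) ./ sigma;   % implicit expansion (R2016b+); use bsxfun on older releases
end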
price = 0;
parameter = [1650, 3];
parameter = (parameter - mu) ./ sigma;   % normalize the query with the training mu/sigma
parameter = [1, parameter];              % add the intercept term
price = theta' * parameter';
fprintf('Predicted price of a 1650 sq-ft, 3 br house (using gradient descent):\n $%f\n', price);
I get the following result for the price: $293081.464335
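As an independent cross-check (not part of the code above), the normal equation gives a closed-form theta on the raw, un-normalized features, so no scaling of the query point is needed; the file name and column layout are taken from the listing above:

% Closed-form solution for comparison with the gradient-descent price
data  = load('ex1data2.txt');
X_raw = [ones(size(data, 1), 1) data(:, 1:2)];
y     = data(:, 3);
theta_ne = pinv(X_raw' * X_raw) * X_raw' * y;   % normal equation
price_ne = [1 1650 3] * theta_ne;               % predict directly on raw features
fprintf('Predicted price (normal equation): $%f\n', price_ne);

The two predictions should roughly agree once gradient descent has converged.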
  • Here is the result for J_history: [attached plot: Cost J vs. number of iterations, a flat line at 2.0433e+09]
I'm not sure where the issue comes from. I've attached the exercise and functions to this post.
Thanks for your help!
Pauline

Answers (0)
