Gradient Descent + Feature Scaling Produces Scaled Thetas

Hi everyone,
Whenever I apply feature scaling to my data, my gradient descent algorithm produces thetas that are different from what the normal equation gives me with the raw data.
However, when I feed this same scaled data to the normal equation, I get exactly the same theta values as gradient descent.
Gradient descent (with feature scaling) produces: theta0 = 68121.597, theta1 = 14307.17
The normal equation (with the raw, unscaled data) gives: theta0 = 34136.17, theta1 = 6.599
However, when I run the normal equation on the scaled data, I get exactly the values gradient descent produced with the scaled data: theta0 = 68121.597, theta1 = 14307.17
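This behavior is expected: scaling the features changes the coordinate system the thetas live in, so both solvers agree on the scaled data but disagree with the raw-data fit. A minimal sketch with made-up data (the `X` and `y` values below are hypothetical, not from the post) illustrates it:

```matlab
% Hypothetical data: the normal equation gives different thetas for
% raw vs. z-score-scaled features, yet both describe the same line.
X = [1500; 2000; 2500; 3000];          % made-up feature values
y = [200000; 260000; 310000; 380000];  % made-up target values

Xraw    = [ones(size(X)), X];
Xscaled = [ones(size(X)), (X - mean(X)) ./ std(X)];

thetaRaw    = pinv(Xraw' * Xraw) * Xraw' * y;          % raw-data thetas
thetaScaled = pinv(Xscaled' * Xscaled) * Xscaled' * y; % scaled-data thetas

% The theta vectors differ, but the fitted values are identical:
yhatRaw    = Xraw * thetaRaw;
yhatScaled = Xscaled * thetaScaled;
```

So the scaled thetas are not "incorrect"; they simply parameterize the line in the scaled coordinates.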
The feature scaling algorithm is this:
function scaled = featureScale(X)
    % Z-score standardization: (x - mean) / std, applied column-wise
    totalSize   = size(X);
    standardDev = std(X);
    means       = mean(X);
    scaled      = ones(size(X));
    for i = 1:totalSize(1, 2)        % loop over columns (features)
        for u = 1:totalSize(1, 1)    % loop over rows (examples)
            scaled(u, i) = (X(u, i) - means(1, i)) / standardDev(1, i);
        end
    end
end
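For reference, the same z-score scaling can be written without loops (a sketch using `repmat` so it works on older MATLAB versions without implicit expansion; the `X` values are hypothetical):

```matlab
% Vectorized z-score scaling; produces the same result as the loop above.
X = [1 10; 2 20; 3 30];   % hypothetical data
mu    = mean(X);
sigma = std(X);
scaled = (X - repmat(mu, size(X, 1), 1)) ./ repmat(sigma, size(X, 1), 1);
% Each column of `scaled` now has mean 0 and standard deviation 1.
```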
I've tried other feature scaling techniques as well, even the mat2gray() function, but my thetas still seem incorrect.
I have a feeling that I probably have to un-scale my thetas somehow. Any advice? Thanks!

Answers (1)

Brad Hesse on 6 Jul 2016
Edited: Brad Hesse on 6 Jul 2016
I'm answering my own question because I've figured it out, at least partially.
I still haven't found out how to un-scale my thetas, but I can still make meaningful predictions by normalizing my x-axis.
So when I want to plot my new prediction line, all I have to do is graph this function:
function y = f(x, thetas, unscaledX)
    % Predict using thetas learned on min-max-normalized features:
    % x is rescaled the same way as during training before applying the line.
    mins = min(unscaledX);
    maxs = max(unscaledX);
    y = thetas(1, 1) + thetas(1, 2) * ((x - mins) / (maxs - mins));
end
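Expanding that function also suggests how the thetas could be mapped back to the original scale, assuming the same min-max normalization was used during training: y = theta0 + theta1*(x - mins)/(maxs - mins) rearranges to a slope of theta1/(maxs - mins) and an intercept of theta0 - theta1*mins/(maxs - mins). A sketch (the `unscaledX` values are made up for illustration):

```matlab
% Sketch: recover original-scale thetas from thetas learned on
% min-max-normalized x. The feature values here are hypothetical.
thetas    = [68121.597, 14307.17];   % [theta0, theta1] on the scaled data
unscaledX = [500; 1200; 2000; 3100]; % made-up raw feature values
mins = min(unscaledX);
maxs = max(unscaledX);

theta1Orig = thetas(2) / (maxs - mins);
theta0Orig = thetas(1) - thetas(2) * mins / (maxs - mins);
% theta0Orig + theta1Orig * x now gives the same prediction as
% evaluating the scaled-space line on normalized x.
```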

Asked: 6 Jul 2016

Edited: 6 Jul 2016
