Gradient Descent + Feature Scaling Produces Scaled Thetas
Hi everyone,
Whenever I use feature scaling on my data, my gradient descent algorithm produces thetas that are different from what the normal equation gives me with the raw data.
However, when I use this same scaled data with the normal equation, I get the exact same theta values.
My gradient descent produces (using feature scaling): theta0 = 68121.597, theta1 = 14307.17
The normal equation gives me (with the raw, unscaled data): theta0 = 34136.17, theta1 = 6.599
However, when I run the normal equation on the scaled data, I get exactly the same values as gradient descent on the scaled data: theta0 = 68121.597, theta1 = 14307.17
The feature scaling algorithm is this:
function scaled = featureScale(X)
    % Standardize each column of X to zero mean and unit standard deviation
    means = mean(X);         % 1-by-n row vector of column means
    standardDev = std(X);    % 1-by-n row vector of column standard deviations
    [m, n] = size(X);
    scaled = zeros(m, n);
    for i = 1:n
        for u = 1:m
            scaled(u, i) = (X(u, i) - means(i)) / standardDev(i);
        end
    end
end
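As a quick sanity check (the data below is made up for illustration), every column of the scaled output should come back with mean approximately 0 and standard deviation approximately 1:

```matlab
X = [1 2000; 2 4100; 3 1500];   % made-up example data, one row per sample
Xs = featureScale(X);
mean(Xs)   % each entry should be approximately 0
std(Xs)    % each entry should be exactly 1
```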
I've tried other feature scaling techniques as well, even the mat2gray() function, but my thetas still seem incorrect.
I have a feeling that I probably have to un-scale my thetas somehow. Any advice? Thanks!
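For what it's worth, the standard way to map the scaled-data thetas back to raw-data units follows from substituting x' = (x - mu)/sigma into the hypothesis: theta0' + theta1'*x' = (theta0' - theta1'*mu/sigma) + (theta1'/sigma)*x. A minimal sketch for the single-feature case (variable names theta0Scaled, theta1Scaled, mu, sigma are mine; mu and sigma are the mean and std that featureScale computed for that column):

```matlab
% Recover raw-data thetas from the thetas fitted on standardized data.
theta1 = theta1Scaled / sigma;                   % slope in raw units
theta0 = theta0Scaled - theta1Scaled * mu / sigma;   % intercept in raw units
```

Plugging in the numbers above, theta1 = 14307.17/sigma should come out to 6.599 and theta0 to 34136.17, matching the normal-equation result on the raw data.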
Answers (1)
Brad Hesse
on 6 Jul 2016
Edited: Brad Hesse
on 6 Jul 2016