Using SVD for Dimensionality Reduction
18 views (last 30 days)
Serra Aksoy
on 29 Mar 2021
Answered: Mahesh Taparia
on 2 Apr 2021
Hello everyone.
I have a matrix with 300 rows (samples) and 5000 columns (features).
I need to reduce the number of columns for classification.
As far as I know, to use the pca() function the number of samples should be greater than the number of features.
So I tried to use the singular value decomposition function with the code below.
% Singular value decomposition of X
[U, Sig, V] = svd(X);
sv = diag(Sig);
% Distribution of the singular values
figure;
sv = sv/sum(sv);
stairs(cumsum(sv));
xlabel('singular values');
ylabel('cumulative sum');
I have two questions.
1) As I understand from the figure above, I need to take approximately 250 singular values to account for 95% of my data.
So should I take the first 250 singular values to create a new dataset for classification?
Also, how can I see the variance of each principal component, like the explained output of the pca() function, to decide how many of them I should use?
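A minimal sketch of one way to get this from the SVD itself, assuming X is mean-centered first (so the squared singular values are proportional to the variance along each component):
Xc = X - mean(X);                       % center the columns, as pca() does internally
[U, Sig, V] = svd(Xc, 'econ');          % economy-size SVD; Sig is 300x300 here
sv = diag(Sig);
explained = 100*sv.^2/sum(sv.^2);       % percent of variance per component
k = find(cumsum(explained) >= 95, 1);   % smallest number of components reaching 95%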
2) After choosing the number of principal components, I need to create a new matrix for classification.
Can I do this with the code below (for example, with the first two principal components)?
new_matrix_for_classification = X*V(:,1:2);
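A minimal sketch of that projection, assuming the data are centered with the column means before multiplying by V (and that V comes from the SVD of the centered matrix):
Xc = X - mean(X);                             % subtract column means first
[U, Sig, V] = svd(Xc, 'econ');
new_matrix_for_classification = Xc*V(:,1:2);  % 300x2 scores; equals U(:,1:2)*Sig(1:2,1:2)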
Thanks in advance.
0 comments
Accepted Answer
Mahesh Taparia
on 2 Apr 2021
Hi,
For the 2nd part, you can use the pca function to directly get the representation of the input in terms of the principal components. For example, if you want the first 2 components:
[coeff,score,latent] = pca(X);
new_matrix_for_classification = score(:,1:2); %score is representation in new space
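For the 1st part, one option (a minimal sketch) is to request pca's explained output as well, which gives the percentage of variance accounted for by each component:
[coeff, score, latent, tsquared, explained] = pca(X);
figure;
stairs(cumsum(explained));                % cumulative percent of variance explained
xlabel('number of components');
ylabel('cumulative variance explained (%)');
k = find(cumsum(explained) >= 95, 1);     % components needed to reach 95%
new_matrix_for_classification = score(:,1:k);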
Hope it will help!
0 comments