In MATLAB, predictions computed manually from the trained LSTM weight parameters are inconsistent with the results of calling the net directly.
9 views (last 30 days)
Dele
on 11 Jan 2024
Commented: Dele
on 11 Jan 2024
Hello everyone
Calling the trained LSTM network in a MATLAB loop is slow, so I plan to extract the weight parameters and compute the predictions myself from the LSTM equations. However, the predicted results are always wrong. I have already checked the normalization and the structural parameters. What other reasons could there be? Does MATLAB perform some additional operation when the net is called?
HiddenLayersNum=128;
LSTM_R1=net.Layers(2,1).RecurrentWeights;
LSTM_W1=net.Layers(2,1).InputWeights;
LSTM_b1=net.Layers(2,1).Bias;
LSTM_R2=net.Layers(4,1).RecurrentWeights;
LSTM_W2=net.Layers(4,1).InputWeights;
LSTM_b2=net.Layers(4,1).Bias;
FC_Weights=net.Layers(6,1).Weights;
FC_Bias=net.Layers(6,1).Bias;
W.Wi1=LSTM_W1(1:HiddenLayersNum,:);
W.Wf1=LSTM_W1(HiddenLayersNum+1:2*HiddenLayersNum,:);
W.Wg1=LSTM_W1(2*HiddenLayersNum+1:3*HiddenLayersNum,:);
W.Wo1=LSTM_W1(3*HiddenLayersNum+1:4*HiddenLayersNum,:);
R.Ri1=LSTM_R1(1:HiddenLayersNum,:);
R.Rf1=LSTM_R1(HiddenLayersNum+1:2*HiddenLayersNum,:);
R.Rg1=LSTM_R1(2*HiddenLayersNum+1:3*HiddenLayersNum,:);
R.Ro1=LSTM_R1(3*HiddenLayersNum+1:4*HiddenLayersNum,:);
b.bi1=LSTM_b1(1:HiddenLayersNum,:);
b.bf1=LSTM_b1(HiddenLayersNum+1:2*HiddenLayersNum,:);
b.bg1=LSTM_b1(2*HiddenLayersNum+1:3*HiddenLayersNum,:);
b.bo1=LSTM_b1(3*HiddenLayersNum+1:4*HiddenLayersNum,:);
W.Wi2=LSTM_W2(1:HiddenLayersNum,:);
W.Wf2=LSTM_W2(HiddenLayersNum+1:2*HiddenLayersNum,:);
W.Wg2=LSTM_W2(2*HiddenLayersNum+1:3*HiddenLayersNum,:);
W.Wo2=LSTM_W2(3*HiddenLayersNum+1:4*HiddenLayersNum,:);
R.Ri2=LSTM_R2(1:HiddenLayersNum,:);
R.Rf2=LSTM_R2(HiddenLayersNum+1:2*HiddenLayersNum,:);
R.Rg2=LSTM_R2(2*HiddenLayersNum+1:3*HiddenLayersNum,:);
R.Ro2=LSTM_R2(3*HiddenLayersNum+1:4*HiddenLayersNum,:);
b.bi2=LSTM_b2(1:HiddenLayersNum,:);
b.bf2=LSTM_b2(HiddenLayersNum+1:2*HiddenLayersNum,:);
b.bg2=LSTM_b2(2*HiddenLayersNum+1:3*HiddenLayersNum,:);
b.bo2=LSTM_b2(3*HiddenLayersNum+1:4*HiddenLayersNum,:);
%LSTM - Layer
h_prev=zeros(HiddenLayersNum,1);%Initial hidden state h(t-1)
c_prev=zeros(HiddenLayersNum,1);%Initial cell state c(t-1)
%Normalization: standardize test inputs with the training-set mean and std
test_in = Test_input(:,2:end)';
P_train = (test_in-Train_data_meanVal')./Train_data_stdVal';
for i=1:size(P_train,2)%Loop over time steps
%LSTM1
%Input Gate
z=W.Wi1*P_train(:,i)+R.Ri1*h_prev+b.bi1;
I = 1.0 ./ (1.0 + exp(-z));%Input gate
%Forget Gate
f=W.Wf1*P_train(:,i)+R.Rf1*h_prev+b.bf1;
F = 1.0 ./ (1.0 + exp(-f));%Forget gate
%Layer Input
g=W.Wg1*P_train(:,i)+R.Rg1*h_prev+b.bg1;%Layer input
G=tanh(g);
%Output Gate
o=W.Wo1*P_train(:,i)+R.Ro1*h_prev+b.bo1;
O = 1.0 ./ (1.0 + exp(-o));%Output Gate
%Cell State
c=F.*c_prev+I.*G;%Cell state update
c_prev=c;
% Output (Hidden) State
h=O.*tanh(c);%Output State
h_prev=h;
L1 = relu(h);
%LSTM2
%Input Gate
z=W.Wi2*L1+R.Ri2*h_prev+b.bi2;
I = 1.0 ./ (1.0 + exp(-z));%Input gate
%Forget Gate
f=W.Wf2*L1+R.Rf2*h_prev+b.bf2;
F = 1.0 ./ (1.0 + exp(-f));%Forget gate
%Layer Input
g=W.Wg2*L1+R.Rg2*h_prev+b.bg2;%Layer input
G=tanh(g);
%Output Gate
o=W.Wo2*L1+R.Ro2*h_prev+b.bo2;
O = 1.0 ./ (1.0 + exp(-o));%Output Gate
%Cell State
c=F.*c_prev+I.*G;%Cell state update
c_prev=c;
% Output (Hidden) State
h=O.*tanh(c);%Output State
h_prev=h;
L2 = relu(h);
% Fully Connected Layers
fc=FC_Weights*L2+FC_Bias;
Predict_value(i) = fc.*Train_label_stdVal+Train_label_meanVal;
end
function output = relu(x)
output = max(0, x);
end
0 comments
Accepted Answer
Ayush Modi
on 11 Jan 2024
Hi,
I understand that you are extracting the weight parameters and computing the predictions manually from the LSTM equations, but the prediction results are incorrect. Assuming the normalization applied is correct, here are some other possible reasons that could lead to this issue:
• Sequence Processing - LSTM networks can process sequences in different ways (e.g., sequence-to-sequence, sequence-to-one). Make sure manual calculations match how the network was trained to process the input sequences.
• Weight and Bias Dimensions - If the extracted weights and biases are organized incorrectly, the predicted values will differ. Make sure the weights and biases have the correct dimensions and are applied in the right order in your manual calculations.
• Numerical Precision - Differences in numerical precision can lead to varying results. MATLAB uses double-precision floating-point by default, but if your manual calculations use a different precision, this could cause discrepancies.
• Mathematical Errors - There could be a simple mathematical error in the manual implementation of the LSTM equations. Double-check the equations and matrix operations (a reference sketch follows this list).
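As a concrete reference for the last two points, here is a minimal sketch of the forward pass in MATLAB, using the gate ordering documented for lstmLayer parameters (input, forget, cell candidate, output rows in InputWeights, RecurrentWeights, and Bias). The helper name lstmStep and the loop around it are illustrative only; the workspace variables (LSTM_W1, P_train, FC_Weights, ...) are the ones extracted in your code, and I am assuming, as in your code, that a ReLU layer follows each LSTM layer. Note that each lstmLayer maintains its own HiddenState and CellState, so a manual re-implementation needs a separate (h, c) pair per LSTM layer.
% Sketch: run the two stacked LSTM layers with a separate state pair per layer
H  = 128;                              % NumHiddenUnits
h1 = zeros(H,1); c1 = zeros(H,1);      % state of the first LSTM layer
h2 = zeros(H,1); c2 = zeros(H,1);      % state of the second LSTM layer
Y  = zeros(size(FC_Weights,1), size(P_train,2));
for t = 1:size(P_train,2)
    [h1, c1] = lstmStep(P_train(:,t), h1, c1, LSTM_W1, LSTM_R1, LSTM_b1, H);
    [h2, c2] = lstmStep(max(0,h1),    h2, c2, LSTM_W2, LSTM_R2, LSTM_b2, H);
    Y(:,t)   = FC_Weights*max(0,h2) + FC_Bias;   % denormalize afterwards, as in your code
end
% Sanity checks against the network itself:
%   net.Layers(2).OutputMode          % 'sequence' vs 'last' (sequence-to-one)
%   Y_net = predict(net, P_train);    % compare with Y
function [h, c] = lstmStep(x, h, c, W, R, b, H)
% One step of a single lstmLayer. W: 4H-by-D InputWeights, R: 4H-by-H
% RecurrentWeights, b: 4H-by-1 Bias, rows ordered input/forget/cell/output.
    ri = 1:H; rf = H+1:2*H; rg = 2*H+1:3*H; ro = 3*H+1:4*H;
    sig = @(z) 1 ./ (1 + exp(-z));                 % gate activation
    it = sig( W(ri,:)*x + R(ri,:)*h + b(ri) );     % input gate
    ft = sig( W(rf,:)*x + R(rf,:)*h + b(rf) );     % forget gate
    gt = tanh( W(rg,:)*x + R(rg,:)*h + b(rg) );    % cell candidate
    ot = sig( W(ro,:)*x + R(ro,:)*h + b(ro) );     % output gate
    c  = ft.*c + it.*gt;                           % cell state update
    h  = ot.*tanh(c);                              % hidden state
end
Comparing Y with the output of predict on a short sequence usually shows at which time step, and therefore in which equation, the two computations start to diverge.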
Hope this helps!
More Answers (0)