How to integrate a trained LSTM neural network into a Simulink model?
CARLOS VIDAL
on 6 Apr 2018
Answered: tarkhani rakia on 25 Nov 2024
Hi, I have trained and tested an LSTM neural network in MATLAB R2018a, but I'm having trouble finding a way to integrate the trained 'net' with a Simulink model. I tried to create a Simulink block using 'gensim(net)', but it doesn't support LSTM networks. If anyone has found a way around that, I'd appreciate it if you could share it. Thank you,
3 comments
Muhammad Faisal Khalid
on 16 Oct 2021
Hi, I have trained and tested an LSTM neural network in MATLAB but do not know how to integrate the trained 'net' with my Simulink model.
Does anybody know?
David Willingham
on 18 Oct 2021
You can use the Stateful Predict or Stateful Classify block to run a trained LSTM in Simulink.
Here are some links:
Accepted Answer
David Willingham
on 19 Oct 2021
You can use the Stateful Predict or Stateful Classify block to run a trained LSTM in Simulink.
Here are some links:
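For reference, what the Stateful Predict block does at each sample time corresponds roughly to calling predictAndUpdateState at the MATLAB command line. Below is a minimal sketch, assuming a trained classification network 'net', a test sequence XTest arranged as features-by-time-steps, and a known number of classes; all names here are placeholders, not from the original post.
numClasses = 5; % placeholder: number of output classes of the trained network
scores = zeros(numClasses,size(XTest,2));
for t = 1:size(XTest,2)
    % feed one time step and keep the updated network state for the next step
    [net,scores(:,t)] = predictAndUpdateState(net,XTest(:,t));
end
net = resetState(net); % clear the state before running another sequence
The Stateful Predict and Stateful Classify blocks manage this state update for you inside the Simulink model, one prediction per simulation step.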
0 comments
More Answers (3)
CARLOS VIDAL
on 10 Apr 2018
Edited: CARLOS VIDAL on 24 May 2018
3 comments
Jiahao CHANG
on 21 May 2021
I'm getting the same error, and as Carlos said it's a matrix dimensions issue. As an LSTM beginner, I think X here should be a single sequence matrix (time steps and features), rather than the full testing dataset you used to validate the model.
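For what it's worth, in the script below X(:,i) picks out one column per time step, so a single test sequence would be laid out roughly as follows; numFeatures and numTimeSteps are placeholders, not values from the original post.
numFeatures = 5; % placeholder: number of input features the network was trained on
numTimeSteps = 100; % placeholder: length of one test sequence
X = rand(numFeatures,numTimeSteps); % each column is the feature vector for one time step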
tarkhani rakia
on 25 Nov 2024
The way I found was to write a script (see below) using the LSTM equations and the weights and biases from my previously trained network, and then create a function in Simulink to call the script, with some small adaptations. It works really well!
X=X_Test; % input sequence: one row per feature, one column per time step
HiddenLayersNum=10; % number of hidden units in the LSTM layer
LSTM_R=net.Layers(2,1).RecurrentWeights; % recurrent weights, 4*numHiddenUnits-by-numHiddenUnits
LSTM_W=net.Layers(2,1).InputWeights; % input weights, 4*numHiddenUnits-by-numFeatures
LSTM_b=net.Layers(2,1).Bias; % bias, 4*numHiddenUnits-by-1
FullyConnected_Weights=net.Layers(3,1).Weights;
FullyConnected_Bias=net.Layers(3,1).Bias;
% The gate parameters are stacked in the order: input, forget, cell candidate, output
W.Wi=LSTM_W(1:HiddenLayersNum,:); % input gate
W.Wf=LSTM_W(HiddenLayersNum+1:2*HiddenLayersNum,:); % forget gate
W.Wg=LSTM_W(2*HiddenLayersNum+1:3*HiddenLayersNum,:); % cell candidate
W.Wo=LSTM_W(3*HiddenLayersNum+1:4*HiddenLayersNum,:); % output gate
R.Ri=LSTM_R(1:HiddenLayersNum,:);
R.Rf=LSTM_R(HiddenLayersNum+1:2*HiddenLayersNum,:);
R.Rg=LSTM_R(2*HiddenLayersNum+1:3*HiddenLayersNum,:);
R.Ro=LSTM_R(3*HiddenLayersNum+1:4*HiddenLayersNum,:);
b.bi=LSTM_b(1:HiddenLayersNum,:);
b.bf=LSTM_b(HiddenLayersNum+1:2*HiddenLayersNum,:);
b.bg=LSTM_b(2*HiddenLayersNum+1:3*HiddenLayersNum,:);
b.bo=LSTM_b(3*HiddenLayersNum+1:4*HiddenLayersNum,:);
%LSTM layer, unrolled over the time steps in X
h_prev=zeros(HiddenLayersNum,1); % initial hidden (output) state
c_prev=zeros(HiddenLayersNum,1); % initial cell state
FC=zeros(size(FullyConnected_Weights,1),size(X,2)); % preallocate output scores
for i=1:size(X,2) % one column of X per time step
%Input Gate
z=W.Wi*X(:,i)+R.Ri*h_prev+b.bi;
I = 1.0 ./ (1.0 + exp(-z));%Input gate
%Forget Gate
f=W.Wf*X(:,i)+R.Rf*h_prev+b.bf;
F = 1.0 ./ (1.0 + exp(-f));%Forget gate
%Layer Input
g=W.Wg*X(:,i)+R.Rg*h_prev+b.bg;%Layer input
G=tanh(g);
%Output Layer
o=W.Wo*X(:,i)+R.Ro*h_prev+b.bo;
O = 1.0 ./ (1.0 + exp(-o));%Output Gate
%Cell State
c=F.*c_prev+I.*G;%Cell state
c_prev=c;
% Output (Hidden) State
h=O.*tanh(c);%Output State
h_prev=h;
% Fully Connected Layers
fc=FullyConnected_Weights*h+FullyConnected_Bias;
FC(:,i)=exp(fc)/sum(exp(fc)); %Softmax
end
[M,II] = max(FC); % index of the highest-scoring class at each time step
YYY= categorical(II,[1 2 3 4 5]);%5 classes
acc = sum(YYY == YY)./numel(YYY) %YY is the *reference* output (ground-truth labels) used to calculate the accuracy of the LSTM when facing an unknown input data set (X_Test)
figure
plot(YYY,'.-')
hold on
plot(YY)
hold off
xlabel("Time Step")
ylabel("Activity")
title("Predicted Activities")
legend(["Predicted" "Test Data"])
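To run this inside Simulink, one possible adaptation (a sketch, not the original author's exact code) is a MATLAB Function block that keeps the LSTM state between sample times using persistent variables. Everything below is hypothetical: the function name, the idea of passing the weight structures W, R, b and the fully connected parameters in as block parameters (which may need extra Simulink setup), and the assumption of one feature vector per simulation step.
function y = lstm_step(x,W,R,b,FC_W,FC_b)
% One LSTM time step for a MATLAB Function block (hypothetical sketch)
% x: feature column vector at the current sample time
% W, R, b: gate weights and biases split as in the script above
% FC_W, FC_b: fully connected layer weights and bias
persistent h_prev c_prev
if isempty(h_prev)
    h_prev = zeros(size(R.Ri,1),1); % hidden state, initialised to zero
    c_prev = zeros(size(R.Ri,1),1); % cell state, initialised to zero
end
I = 1./(1+exp(-(W.Wi*x+R.Ri*h_prev+b.bi))); % input gate
F = 1./(1+exp(-(W.Wf*x+R.Rf*h_prev+b.bf))); % forget gate
G = tanh(W.Wg*x+R.Rg*h_prev+b.bg); % cell candidate
O = 1./(1+exp(-(W.Wo*x+R.Ro*h_prev+b.bo))); % output gate
c_prev = F.*c_prev+I.*G; % update cell state
h_prev = O.*tanh(c_prev); % update hidden (output) state
fc = FC_W*h_prev+FC_b;
y = exp(fc)./sum(exp(fc)); % softmax scores
end
Each simulation step then feeds one feature vector into the block, and the persistent variables play the role of h_prev and c_prev in the loop above.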
0 comments