Can't train a custom neural network
Hello,
I am trying to create a neural network of the type shown in the picture. This is a pre-structured network that mimics a certain equation. However, I am having difficulty creating the network structure using the network function for custom networks.
Here is the code I currently have:
clear; clc; rng(0);
%% 1) Get the data from Simulink
simOut = sim('MSD_Model');   % Simulink model
u_data = simOut.get('u');         % To Workspace: u (vector)
y_data = simOut.get('y');         % To Workspace: y (vector)
u_data = u_data(:)';              % 1xN
y_data = y_data(:)';              % 1xN
% Cell arrays (1xN, each cell a scalar)
U = con2seq(u_data);
Y = con2seq(y_data);
%% 2) SSNN-like custom RNN architecture
n = 1;   % input
s = 2;   % state (Layer 2)
h = 4;   % hidden S (Layer 1)
h2 = 2;  % hidden O (Layer 3)
m = 1;   % output
net = network;
net.numInputs  = 1;
net.numLayers  = 4;
% Connection topology
% L1 <- Input
% L1 <- L2(z^-1)  (x̂(t-1) feedback)
% L2 <- L1
% L3 <- L2
% L4 <- L3        (output)
net.inputConnect = [1;0;0;0];
net.layerConnect = logical([ ...
    0 1 0 0;  % L1 <- L2 (delayed)
    1 0 0 0;  % L2 <- L1
    0 1 0 0;  % L3 <- L2
    0 0 1 0]);% L4 <- L3
net.outputConnect = [false false false true];
% Layer sizes
net.inputs{1}.size = n;
net.layers{1}.size = h;   % Hidden S
net.layers{2}.size = s;   % State layer (x̂)
net.layers{3}.size = h2;  % Hidden O
net.layers{4}.size = m;   % Output
% Transfer functions (SSNN-compatible)
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'purelin';
net.layers{3}.transferFcn = 'tansig';
net.layers{4}.transferFcn = 'purelin';
% Delays
net.layerWeights{1,2}.delays = 1;  % x̂(t-1) -> Hidden S
net.layerWeights{2,1}.delays = 0;
net.layerWeights{3,2}.delays = 1;
net.layerWeights{4,3}.delays = 0;
% Biases
net.biasConnect = [1;1;1;1];
% Performance and training
net.performFcn = 'mse';
net.trainFcn   = 'trainlm';
net.trainParam.epochs    = 100;
net.trainParam.goal      = 1e-6;
net.trainParam.max_fail  = 12;
net.trainParam.showWindow = true;
% Time-series division (as a single block)
net.divideFcn = 'dividetrain';  % all data for training, no early stopping
% Pre-configuration & init
net = configure(net, U, Y);
net = init(net);
%% 3) Training
[net, tr] = train(net, U, Y);
%% 4) Prediction and performance
Y_pred = net(U);
mse_train = perform(net, Y(tr.trainInd), Y_pred(tr.trainInd));
fprintf('MSE (train): %.3e\n', mse_train);
% Diagnostic plots
figure; plotperform(tr);
figure; plotregression(cell2mat(Y), cell2mat(Y_pred), 'Regression');
% Time-series comparison
figure; hold on; grid on;
plot(cell2mat(Y)',      'LineWidth',1.5);
plot(cell2mat(Y_pred)', '--', 'LineWidth',1.5);
xlabel('Time (sample)'); ylabel('Output');
legend('Actual Y(t)','Predicted Ŷ(t)'); title('SSNN-RNN Time-Series Modeling');
%% 5) Save & generate a Simulink block
save('trained_SSNN.mat','net');
Ts = 0.01;                 % match your model's sample time
gensim(net, Ts);           % generates a Simulink block (I/O = 1x1 under the mask)
The problem is that I cannot get the training to run properly; it always stops at the second epoch. I also tried a layered recurrent network (layrecnet), but the problem remains the same.
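For reference, the layrecnet attempt followed the standard documented pattern, roughly like this (the delays and hidden size shown are illustrative; my actual settings may have differed):
lrn = layrecnet(1:2, 8);                  % layer delays 1:2, 8 hidden neurons (illustrative)
[Xs, Xi, Ai, Ts] = preparets(lrn, U, Y);  % shift the data to account for the delays
lrn = train(lrn, Xs, Ts, Xi, Ai);
Y_lrn = lrn(Xs, Xi, Ai);
perf_lrn = perform(lrn, Ts, Y_lrn)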
Could you please help me understand what I am doing wrong?
Also, are there any good resources or documentation on how to create and train custom neural networks in MATLAB (beyond the standard predefined types)?
Thank you in advance!

1 comment
Matt J on 24 Aug 2025
				Your network isn't exactly shallow. Since you seem to have the Deep Learning Toolbox, it might be better to implement and train the network in that framework (trainnet) instead. 
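A rough sketch of such a reimplementation, approximating the delayed state feedback with a gruLayer rather than reproducing the exact SSNN wiring (the layer sizes, training options, and data shapes below are assumptions, not tuned values):
% Rough dlnetwork/trainnet analogue of the SSNN architecture (illustrative only):
% a gruLayer carries the recurrent state that the custom network models with
% the delayed Layer 2 -> Layer 1 connection. Assumes u_data and y_data are the
% 1xN vectors from the script above.
X = {u_data(:)};                 % one sequence, timeSteps-by-channels
T = {y_data(:)};                 % (transpose if your release expects channels-by-time)
layers = [
    sequenceInputLayer(1, Normalization="zscore")
    fullyConnectedLayer(8)       % "Hidden S" analogue
    tanhLayer
    gruLayer(4)                  % recurrent state, stand-in for the x̂(t-1) feedback
    fullyConnectedLayer(8)       % "Hidden O" analogue
    tanhLayer
    fullyConnectedLayer(1)];     % output
options = trainingOptions("adam", ...
    MaxEpochs=300, ...
    InitialLearnRate=1e-2, ...
    Plots="training-progress", ...
    Verbose=false);
netDeep = trainnet(X, T, dlnetwork(layers), "mse", options);
% Inference on the same sequence (channel-by-time formatted dlarray)
yHat = extractdata(predict(netDeep, dlarray(u_data, "CT")));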
Answers (1)
Shantanu on 2 Sep 2025
Hi Mohamad,
I went through your architecture and code; you could consider the following suggestions to fix the training issue:
1. Normalize the data.
2. Try a different activation function, such as 'tansig', in layer 2.
3. The number of neurons in the hidden layers is too low. Increase the counts significantly to give the model enough capacity to learn the pattern.
4. Use other training algorithms such as 'trainbr' or 'trainscg'.
5. Try setting the delay from Layer 2 to Layer 3 to 0.
Also, here is some documentation that might help.
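As a concrete starting point, suggestions 1, 3, 4 and 5 could be applied to your script roughly like this (the sizes and algorithm choice are illustrative, not tuned; u_data, y_data and net are the variables from your code):
% 1) Normalize input and output to [-1, 1] before con2seq
[un, us] = mapminmax(u_data);        % us keeps the scaling settings
[yn, ys] = mapminmax(y_data);
U = con2seq(un);
Y = con2seq(yn);
% 2) Optionally a nonlinear transfer function in the state layer
% net.layers{2}.transferFcn = 'tansig';
% 3) More capacity in the hidden layers
net.layers{1}.size = 10;             % Hidden S
net.layers{3}.size = 10;             % Hidden O
% 4) A different training algorithm
net.trainFcn = 'trainbr';            % or 'trainscg'
net.trainParam.epochs = 500;
% 5) No delay on the Layer 2 -> Layer 3 connection
net.layerWeights{3,2}.delays = 0;
% Reconfigure, reinitialize and train
net = configure(net, U, Y);
net = init(net);
[net, tr] = train(net, U, Y);
% Map predictions back to the original scale
Y_pred = net(U);
y_hat  = mapminmax('reverse', cell2mat(Y_pred), ys);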