Why does MATLAB not provide an activations function for Recurrent Neural Networks?
MATLAB has a function called "activations" that produces the activations of a specific layer of a SeriesNetwork: https://www.mathworks.com/help/nnet/ref/activations.html
However, it does not work with RNN sequence networks. So, is there a way to get the activations of a specific layer of an RNN/LSTM network?
Thanks in advance.
1 comment
Stuart Whipp on 10 Nov 2018
Edited: Stuart Whipp on 11 Nov 2018
Can confirm this works with ReLU, LSTM & BiLSTM layers (also using a custom regression output). As a trivial solution, why not slice your network at the desired layer and then run the predict command? There is no weight update, so the result should be identical to extracting the activations of that layer. Copy and paste this into a .m file, hope it helps :)
function activations = myActivations(net,data,layer_no)
% Return the output of layer layer_no by slicing the network at that layer
% and running predict on the truncated network.
numLayers = numel(net.Layers);
if ~isnumeric(layer_no) || ~isscalar(layer_no)
    error('layer_no (3rd argument) should be an integer, representing the index of the layer to activate')
elseif layer_no > numLayers-1 || layer_no < 2
    error('layer_no out of range, select a number between 2 and %d',numLayers-1)
end
if isa(net.Layers(numLayers),'nnet.cnn.layer.RegressionOutputLayer')
    % Pretty straightforward when it is a regression network: keep the
    % layers up to layer_no plus the existing regression output layer.
    net_new = SeriesNetwork(net.Layers([1:layer_no numLayers]));
elseif layer_no == numLayers-1
    error(['layer_no exceeds network size, select a number between 2 and %d. ' ...
        'For softmax scores, use the second output argument of classify()'],numLayers-2)
else
    % We have to cut off the classification output and replace it with a
    % regression layer so the truncated layers form a 'valid system' for predict.
    layers = [net.Layers(1:layer_no); regressionLayer];
    net_new = SeriesNetwork(layers);
end
activations = predict(net_new,data);
end
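For example, assuming a trained LSTM classification network called trainedNet and a cell array of test sequences XTest (both hypothetical names, just for illustration), the outputs of the second layer could be extracted like this:

% Hypothetical usage: trainedNet and XTest are assumed to exist already.
layer2_acts = myActivations(trainedNet, XTest, 2);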
I've also come across this, though as I understand it you would need the activations from the preceding layer to view an LSTM layer's outputs (unless it's the first layer in your network, in which case you would use the network's inputs). https://uk.mathworks.com/matlabcentral/fileexchange/64863-getlstmoutput
My method above only needs the network's inputs, regardless of which layer's output you want, since it feeds forward through all layers up to layer_no (the layers after layer_no are discarded).
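As the error message in the function hints, if what you actually want are the softmax class scores rather than an intermediate layer's activations, no slicing is needed at all: classify already returns them as a second output (a minimal sketch; YPred and scores are just illustrative names):

% Softmax class scores come straight from the second output of classify.
[YPred, scores] = classify(net, data);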
Answers (0)