Unable to incorporate my own custom loss function in R2024a
8 views (last 30 days)
Show older comments
Petr Kolar
on 26 Aug 2024
Commented: Petr Kolar
on 28 Aug 2024
Switching from R2023b to R2024a, I made some changes in my net (CNN): e.g. I modified the input/output, replaced the RegressionLayer with a SoftmaxLayer, switched to the trainnet function, etc.
I expected better performance and future compatibility (RegressionLayer is no longer recommended), and I plan to optimize my net using the pruning approach, etc.
Contrary to the previous version, I am not able to plug in my own loss function (as was done in the previous version).
The (simplified) code is below; the syntax used was inspired by an example.
The error message is:
Error using trainnet (line 46)
Error calling function during training.
Error in callMyLoss (line 55)
myTrainedNet = trainnet(Y,target,net, @(Y,target) myOwnLoss(name,Y,target),options);
Caused by:
Error using myOwnLoss
The specified superclass 'nnet.layer.softmaxLayer' contains a parse error, cannot be found on MATLAB's search path, or is shadowed by another file with the same name.
Error in callMyLoss>@(Y,target)myOwnLoss(name,Y,target) (line 55)
myTrainedNet = trainnet(Y,target,net, @(Y,target) myOwnLoss(name,Y,target),options);
Error in nnet.internal.cnn.util.UserCodeException.fevalUserCode (line 11)
[varargout{1:nargout}] = feval(F, varargin{:});
classdef myOwnLoss < nnet.layer.softmaxLayer
    % own Loss
    methods
        %function layer = sseClassificationLayer(name)
        function layer = myOwnLoss(name)
            % layer = sseClassificationLayer(name) creates a sum of squares
            % error classification layer and specifies the layer name.

            % Set layer name.
            layer.Name = name;

            % Set layer description.
            layer.Description = 'my own Loss v.2024a';
        end

        function loss = forwardLoss(layer, Y, T)
            %%% function loss = forwardLoss(Yo, To)
            % loss = forwardLoss(layer, Y, T) returns the Tdiff loss between
            % the predictions Y and the training targets T.
            disp("myLoss");
            aa = 1;
            % just something very simple
            loss = sum(Y-T,'all');
        end

        % original backwardLoss
        function dX = backwardLoss(layer, Y, T)
            numObservations = size(Y, 3);
            dX = (Y - T)./numObservations;
        end
    end
end
%=======================eof=========================
0 comments
Accepted Answer
Malay Agarwal
on 26 Aug 2024
Edited: Malay Agarwal
on 26 Aug 2024
The error occurs because "softmaxLayer" is actually a function, not a class, so it cannot act as a superclass. You can confirm this by using the following command:
which softmaxLayer
Opening the file it points to shows a function definition rather than a classdef:
function layer = softmaxLayer( nameValueArgs )
% softmaxLayer Softmax layer
% ...
end
I also tried running your example code in MATLAB R2023b and received the same error, suggesting that it does not work in R2023b either.
According to the documentation, the recommended approach is to define a custom loss function as a MATLAB function rather than a class: https://www.mathworks.com/help/deeplearning/ug/define-custom-training-loops-loss-functions-and-networks.html#mw_40e667e2-1ea1-4793-b079-8bc763144200. The function must have the signature "loss = f(Y,T)", where "Y" contains the predictions and "T" the targets. Using a function has several benefits, such as support for automatic differentiation, provided all the operations in it are compatible with "dlarray".
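For example, a minimal sketch of the function form (the name "mySseLoss" and the variables "XTrain", "TTrain", "net", and "options" are placeholders, not taken from your original code):

function loss = mySseLoss(Y,T)
    % Sum-of-squares error between predictions Y and targets T,
    % reduced to the scalar value that trainnet expects.
    % All operations here support dlarray, so automatic
    % differentiation works and no backward function is needed.
    loss = sum((Y - T).^2,"all");
end

The function is then passed to "trainnet" directly as a handle:

myTrainedNet = trainnet(XTrain,TTrain,net,@mySseLoss,options);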
If you want a more complex loss function that takes inputs other than the predictions and targets, you need to use a custom training loop with the custom loss function, as detailed in the following example: https://www.mathworks.com/help/deeplearning/ug/train-generative-adversarial-network.html#TrainGenerativeAdversarialNetworkGANExample-9.
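A minimal sketch of the core of such a loop ("modelLoss" and the extra weighting input "w" are placeholders, and "net" is assumed to be a dlnetwork):

function [loss,gradients] = modelLoss(net,X,T,w)
    % Forward pass, then a loss that uses an extra input w
    % besides the predictions Y and the targets T.
    Y = forward(net,X);
    loss = sum(w .* (Y - T).^2,"all");
    gradients = dlgradient(loss,net.Learnables);
end

Inside the training loop, the loss and gradients are evaluated with "dlfeval" and the network is updated with a solver such as "adamupdate" (initialize "avgGrad" and "avgSqGrad" as empty arrays before the first iteration):

[loss,gradients] = dlfeval(@modelLoss,net,X,T,w);
[net,avgGrad,avgSqGrad] = adamupdate(net,gradients,avgGrad,avgSqGrad,iteration);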
Hope this helps!
7 comments
Malay Agarwal
on 28 Aug 2024
Edited: Malay Agarwal
on 28 Aug 2024
In your code, the following line:
netCNN_3dUnetL2 = trainnet(st1,pt1,net,@(st1,pt1) myLoss24a(st1,pt1),options);
needs to be:
netCNN_3dUnetL2 = trainnet(st1,pt1,net,@myLoss24a,options);
Notice that only the name of the loss function is passed, as a function handle with no arguments. MATLAB automatically determines the predictions and targets to pass to the loss function based on the other arguments of "trainnet".
I have attached a working version of the script, which has a dummy loss function at the end. Please replace it with your own loss function. Also note that the loss function needs to return a scalar value.
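(The attachment itself is not reproduced here; a dummy loss of the required shape could look like the following, where the "all" reduction guarantees a scalar result.)

function loss = myLoss24a(Y,T)
    % Dummy loss: mean squared error, reduced to a single scalar.
    loss = mean((Y - T).^2,"all");
end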
More Answers (0)