I understand that you are encountering the error "Value to differentiate is not traced" while training your custom neural network in MATLAB. I was able to reproduce the issue with the provided code. It arises because gradient tracking is broken inside the "modelLoss" function by the "extractdata" and "fitlme" calls: "extractdata" strips the trace from a "dlarray", and "fitlme" does not operate on "dlarray" inputs.
To resolve this, the following changes are needed in the provided code:
- Remove the "extractdata" calls to maintain gradient tracking.
- Replace "fitlme" with a custom regression function that operates on "dlarray" inputs.
- Use element-wise operations (".*" instead of "*") to stay compatible with "dlarray".
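To see why the first point matters, here is a minimal standalone sketch (not your network; the function and variable names are illustrative) showing that "dlgradient" only works while the value is still traced:

```matlab
function demoTracing()
    x = dlarray(3);
    g = dlfeval(@lossFcn, x);   % works: x stays a traced dlarray
end

function g = lossFcn(x)
    y = x.^2;                   % traced operation on a dlarray
    % y = extractdata(x).^2;    % would sever the trace and trigger
                                % "Value to differentiate is not traced"
    g = dlgradient(y, x);
end
```

Any value passed through "extractdata" (or through a function like "fitlme" that internally converts to plain numeric arrays) loses this trace, so the loss must be built entirely from "dlarray" operations.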
Kindly refer to the following corrected training loop:

for i = 1:numEpochs
    fprintf('The %dth epoch.\n', i);
    iteration = iteration + 1;

    % Keep X and Y as dlarray so operations on them are traced
    X = dlarray(data{:,'x'},'BC');
    Y = dlarray(data{:,'y'},'BC');

    % Evaluate loss and gradients inside dlfeval so tracing is enabled
    [loss,gradients] = dlfeval(@modelLoss, net, X, Y);

    % Update the network parameters with Adam
    [net,trailingAvg,trailingAvgSqD] = adamupdate(net, gradients, ...
        trailingAvg, trailingAvgSqD, iteration, ...
        learnRate, gradientDecayFactor, squaredGradientDecayFactor);
end
The corrected "modelLoss" function:
function [loss,gradients] = modelLoss(net, X, Y)
    output = forward(net, X);

    % mask_neg/mask_pos and the corresponding X_neg, Y_neg, X_pos, Y_pos
    % splits are computed from X, Y, and output as in your original code
    n_neg = sum(mask_neg, 'all');
    n_pos = sum(mask_pos, 'all');

    if n_neg >= 5 && n_pos >= 5
        % Enough samples in each group: fit a separate regression per group
        beta_neg = calculateBeta(X_neg(mask_neg), Y_neg(mask_neg));
        pred_neg = X_neg .* beta_neg;
        loss_neg = sum((Y_neg - pred_neg).^2, 'all');

        beta_pos = calculateBeta(X_pos(mask_pos), Y_pos(mask_pos));
        pred_pos = X_pos .* beta_pos;
        loss_pos = sum((Y_pos - pred_pos).^2, 'all');

        loss = loss_neg + loss_pos;
    else
        % Too few samples in a group: fit a single regression on all data
        beta = calculateBeta(X, Y);
        pred = X .* beta;
        loss = sum((Y - pred).^2, 'all');
    end

    gradients = dlgradient(loss, net.Learnables);
end
Helper function ("calculateBeta" replaces "fitlme"; shown here as a simple least-squares slope without an intercept, built only from dlarray-compatible element-wise operations):

function beta = calculateBeta(X, Y)
    % Closed-form least-squares slope: beta = sum(X.*Y) / sum(X.^2)
    % Using .*, .^, and sum keeps the computation traced for dlgradient
    beta = sum(X .* Y, 'all') ./ sum(X .^ 2, 'all');
end
This implementation solves the error, and gradient tracking will be maintained throughout the training loop.
For further reference, please check the following MATLAB documentation pages on "dlarray" and "dlgradient":
I hope this helps!
(As can be seen in the screenshot below, the error has been resolved and training is proceeding.)