Error custom training loop: Value to differentiate must be a traced dlarray scalar.

6 views (last 30 days)
Is it possible to include a Blackbox and still use Automatic Differentiation in MATLAB?
I am trying to do the following.
1) I have 3 input features, the x, y, and z locations of each example, computed using a custom function (getcondvects_n_k). There are M such examples, so xyz is a dlarray of size 3-by-M:
xyz=dlarray(flip(getcondvects_n_k([], 3, val_vectors),2),'BC');
2) A NN computes a value between 0 and 1 (from the final sigmoid) for each example:
layers = [
featureInputLayer(3,"Name","elementCenterLocations")
fullyConnectedLayer(20,"Name","fclayer1")
batchNormalizationLayer("Name","batchnorm1")
leakyReluLayer(0.3,"Name","leakyrelu1")
fullyConnectedLayer(1,"Name","fclayer2")
sigmoidLayer("Name","sigmoid")];
lgraph = layerGraph(layers);
dlnet=dlnetwork(lgraph);
3) Forward Pass
r=forward(dlnet,xyz);
4) Blackbox
The output from the NN is fed to a separate function. It acts like a custom loss function and computes the loss and the derivative of the loss with respect to r, i.e. dl_dr, which is an nx-by-ny-by-nz array:
R=reshape(double(extractdata(r)),nx, ny,nz);
[loss, dl_dr]=black_box(R, other_inputs);
5) Backward Pass
So I want to use dl_dr to update the weights of the NN:
grad = dlgradient(dlarray(dl_dr(:)),dlnet.Learnables,'RetainData',true);
[dlnet,averageGrad,averageSqGrad] = adamupdate(dlnet,grad,averageGrad,averageSqGrad,loop,learnRate);
6) The Forward Pass, Blackbox and Backward Pass will be in a custom training loop.
I'm getting the error when dlgradient is called. Can you please suggest any changes? There are no known target outputs Y, and the blackbox has many steps that involve matrix inversion; its inputs cannot be dlarray objects. But the equation relating the loss and r is straightforward, and hence its derivative is also straightforward.
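Putting steps 3) to 5) together, the training loop currently looks roughly like this (simplified; numIterations is just a placeholder for however many iterations I run, and nx, ny, nz, other_inputs and learnRate are defined earlier in my script):
averageGrad = [];
averageSqGrad = [];
for loop = 1:numIterations
    % 3) Forward pass
    r = forward(dlnet,xyz);
    % 4) Blackbox: works on plain doubles, returns the loss and dL/dr
    R = reshape(double(extractdata(r)),nx,ny,nz);
    [loss, dl_dr] = black_box(R, other_inputs);
    % 5) Backward pass -- this is the line that throws the error
    grad = dlgradient(dlarray(dl_dr(:)),dlnet.Learnables,'RetainData',true);
    [dlnet,averageGrad,averageSqGrad] = adamupdate(dlnet,grad,averageGrad,averageSqGrad,loop,learnRate);
end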

Accepted Answer

Mahesh Taparia on 29 Jul 2021
Hi,
You are converting the dlarray data to double and then converting it back to a dlarray to perform automatic differentiation.
To overcome this issue, do not convert the dlarray data to double with the extractdata function. Once the data has been extracted it no longer carries the automatic differentiation trace, so the gradients are lost. Instead, try to find a way to use the input of the black_box function directly in dlarray format, do the required computation on top of that, and return the output as a dlarray. That will solve the issue.
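For example, if the required computation can be expressed with dlarray-supported operations, a model-gradients function along these lines keeps the whole chain traced (the loss below is only a placeholder for your actual computation):
function [loss,gradients] = modelGradients(dlnet,xyz)
    r = forward(dlnet,xyz);              % traced dlarray
    loss = sum((r - 0.5).^2,'all');      % placeholder: scalar loss built from dlarray operations only
    gradients = dlgradient(loss,dlnet.Learnables);
end
Evaluate it with dlfeval so that automatic differentiation is enabled:
[loss,grad] = dlfeval(@modelGradients,dlnet,xyz);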
Hope this helps!
  1 comment
vaishakh c on 31 Jul 2021
Thanks for your answer, it was of great help. I made the following changes and got it to work:
1) I made a copy of the dlarray that I get from the neural network forward pass, converted the copy to double, and used that in black_box.
2) I was also passing a vector as the first argument to dlgradient, but it has to be a scalar. That was my fault: loss functions are usually summed over all examples, so the loss should have been a scalar (see the sketch below).
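The sketch below is a simplified illustration of this pattern rather than my exact code; in it, the surrogate scalar loss sum(r .* dl_dr,'all') treats the externally computed gradient dl_dr as a constant weighting of r, so dlgradient has a traced scalar to differentiate and the chain rule pushes dl_dr back through the network:
function [loss,gradients] = modelGradients(dlnet,xyz,other_inputs,nx,ny,nz)
    % Forward pass: r stays a traced dlarray
    r = forward(dlnet,xyz);
    % Plain-double copy for the blackbox; r itself is left untouched
    R = reshape(double(extractdata(r)),nx,ny,nz);
    [loss, dl_dr] = black_box(R, other_inputs);
    % Surrogate scalar loss: its gradient with respect to r is exactly dl_dr
    surrogate = sum(r .* reshape(dl_dr,size(r)),'all');
    gradients = dlgradient(surrogate,dlnet.Learnables);
end
Called inside the training loop with dlfeval:
[loss,grad] = dlfeval(@modelGradients,dlnet,xyz,other_inputs,nx,ny,nz);
[dlnet,averageGrad,averageSqGrad] = adamupdate(dlnet,grad,averageGrad,averageSqGrad,loop,learnRate);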


More Answers (0)

Release

R2020b
