How to calculate the gradient of the output of a neural network with respect to its parameters?

I am new to the Deep Learning Toolbox. I am working on a reinforcement learning problem in which I need to calculate the derivative of the output of a neural network with respect to its parameters.
More specifically, let the I/O relation of the neural network be y = f(x, θ), where x is the input, y is the output, and θ contains the weights and biases of the network. For a specific input x0, I am interested in calculating the gradient ∂y/∂θ evaluated at x = x0. Any idea how I should go about this with the Deep Learning Toolbox?

Accepted Answer

Jon Cherrie
Jon Cherrie on 27 Apr 2021
You can use dlgradient and dlfeval to take derivatives with respect to the input or with respect to the parameters.
Here's an example:
I'm going to let this function play the role of the neural network. For a more complex network you should use dlnetwork; a sketch of that case follows the example output below.
function y = f(x,theta)
% Scalar "network" output as a function of the input x and parameter vector theta
y = sum(sin(theta(1)*x+theta(2)));
end
Then I need to scope the computation of the function so that dlfeval knows where to apply auto-diff. I do that by defining a function that evaluates the network and computes the gradient of interest. In this case:
function [y, dy] = fun_and_deriv(x,theta)
y = f(x,theta);            % evaluate the "network"
dy = dlgradient(y,theta);  % derivative of y with respect to theta
end
Then I can compute the derivative:
x = dlarray(0:10);       % inputs and parameters must be dlarray for automatic differentiation
theta = dlarray([1 2]);
[y, dy] = dlfeval(@fun_and_deriv,x,theta)
which gives:
y =
1×1 dlarray
-0.9668
dy =
1×2 dlarray
0.6788 -1.1095
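
For a network built with dlnetwork, the pattern is the same: evaluate the forward pass and dlgradient inside one function, and call that function with dlfeval, asking for the gradient with respect to net.Learnables. Here is a minimal sketch of that case; the layer sizes, the random input x0, and the helper name model_and_grad below are placeholders for illustration, not anything specific to your problem:
layers = [
    featureInputLayer(3)              % 3 input features (placeholder size)
    fullyConnectedLayer(5)
    tanhLayer
    fullyConnectedLayer(1)];          % scalar output, so y is a valid target for dlgradient
net = dlnetwork(layerGraph(layers));
x0 = dlarray(rand(3,1),'CB');         % one observation: 'C' = channel dim, 'B' = batch dim
[y, grad] = dlfeval(@model_and_grad, net, x0);
with the scoped function defined as:
function [y, grad] = model_and_grad(net, x)
y = forward(net, x);                  % network output for this input
grad = dlgradient(y, net.Learnables); % gradient of the output w.r.t. every weight and bias
end
Here grad comes back as a table with the same rows as net.Learnables, and the gradient for each weight and bias sits in its Value column. If you also want the derivative with respect to the input, you can request it from the same call, e.g. [grad, dx] = dlgradient(y, net.Learnables, x).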

More Answers (0)

Version

R2021a
