Get input/output gradient of neural network

9 views (last 30 days)
Aaron Kandel
Aaron Kandel on 11 Sep 2020
Commented: Ruyue Yang on 22 Jul 2021
MATLAB's built-in functions in the NN toolbox seem to provide a good set of options for getting the gradient of the network performance with respect to the network parameters. Is there a way to get the gradient of the network output with respect to the network input?

Answers (1)

Mahesh Taparia
Mahesh Taparia on 14 Sep 2020
Hi
In general, a neural network tries to learn the weights that reduce the cost/loss function. The weights are updated iteratively using the gradient of the loss function with respect to the weights.
Usually, for a fixed input, calculating the gradient of the loss with respect to the input is not meaningful, because if the input is fixed then d(loss)/d(input) is not defined. If the network is fed two different inputs, you can approximate the gradient with a finite difference (Loss2 - Loss1)/(X2 - X1), where Loss is the network loss for input X. This quantity is not used while training the network.
Hope it helps!
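The same finite-difference idea can be applied to the network output rather than the loss. A minimal sketch, assuming a trained shallow network object net (for example from fitnet/train) with 3 inputs that can be evaluated as net(x):

% Finite-difference approximation of d(output)/d(input) for a trained
% shallow network "net" (hypothetical; e.g. net = train(fitnet(10), X, Y))
x0    = rand(3,1);             % baseline input
delta = 1e-6;                  % perturbation size
y0    = net(x0);               % network output at x0
dydx  = zeros(numel(y0), numel(x0));
for i = 1:numel(x0)
    xp        = x0;
    xp(i)     = xp(i) + delta;             % perturb one input component
    dydx(:,i) = (net(xp) - y0) / delta;    % forward-difference Jacobian column
end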
  6 comments
David Leather
David Leather on 24 Nov 2020
Edited: David Leather on 24 Nov 2020
This seems like an oversight. When applying the trained neural network to other applications, it is essential to be able to evaluate the gradient of the network output, and not of the loss function....
Ruyue Yang
Ruyue Yang on 22 Jul 2021
Getting the gradient dy/dx can be really tricky for a trained multi-layer neural network (not that deep, maybe 3 or 4 layers). Such a function would help a lot in the network's various applications.
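For a network that can be rebuilt as a dlnetwork, one option is automatic differentiation with dlgradient (Deep Learning Toolbox, R2019b or later; featureInputLayer needs R2020b or later). A hedged sketch with placeholder layer sizes, not any particular network from this thread:

% Compute d(output)/d(input) with automatic differentiation
layers = [
    featureInputLayer(3)
    fullyConnectedLayer(10)
    tanhLayer
    fullyConnectedLayer(1)];
net = dlnetwork(layers);

x = dlarray(single(rand(3,1)), 'CB');          % one observation, channel-by-batch

[y, dydx] = dlfeval(@outputGradient, net, x);  % dydx is d(output)/d(input)

function [y, dydx] = outputGradient(net, x)
    y    = forward(net, x);        % trace the forward pass
    dydx = dlgradient(sum(y), x);  % gradient of the (scalar) output w.r.t. the input
end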


Categories

Find more on Sequence and Numeric Feature Data Workflows in Help Center and File Exchange.

Version

R2020a
