How to Show the Weight or Bias in a Neural Network?

123 views (last 30 days)
Kai Hao Then on 2 Mar 2011
Commented: Aswany on 6 Dec 2023
How can I show the weight/bias of every layer in my neural network? I am building a feedforward neural network with 2 hidden layers. Also, how do I determine how many hidden layers I should use? Currently I have 3 inputs and 1 output. When I tried to increase the number of hidden layers to 3, I got an error saying that I do not have sufficient inputs for 3 hidden layers.
  1 comment
Shani on 19 Nov 2013
I am trying to create a neural network. Would you have any notes I could use to help me with that, please?


Accepted Answer

Greg Heath on 19 Apr 2013
Edited: Greg Heath on 19 Apr 2013
1. If the input/output transformation function is reasonably well behaved, 1 hidden layer is sufficient. The resulting net is a universal approximator.
2. However, if you need a ridiculously high number of hidden nodes, H (especially if the number of unknown weights Nw = (I+1)*H+(H+1)*O approaches or exceeds the number of training equations Ntrneq = Ntrn*O), you can reduce the total number of nodes by introducing a second hidden layer. A short sketch of this check follows point 4.
[ I Ntrn ] = size(trninput)
[ O Ntrn ] = size(trntarget)
3. Nevertheless, it is usually better to stick with 1 hidden layer and use a validation stopping subset (the default) and/or a regularized objective function (an option of mse: help mse) or a regularization training function (help trainbr).
4. Sometimes a ridiculously high number of weights is the result of using a ridiculously high number of inputs. So, it may be worthwhile to consider input subset selection before resorting to a second hidden layer.
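As a rough illustration of the check in point 2 (assuming trninput and trntarget hold one case per column, as in the size calls above, and picking H = 10 arbitrarily):
[ I, Ntrn ] = size(trninput);   % I inputs, Ntrn training cases
[ O, Ntrn ] = size(trntarget);  % O outputs
H = 10;                         % candidate number of hidden nodes (assumed)
Nw     = (I+1)*H + (H+1)*O;     % unknown weights for a single hidden layer
Ntrneq = Ntrn*O;                % number of training equations
% When Nw approaches or exceeds Ntrneq there are more unknowns than
% equations, so overfitting is likely; reduce H, get more data, use
% validation stopping / regularization, or consider a second hidden layer.
if Nw >= Ntrneq
    warning('Nw (%d) >= Ntrneq (%d): too many weights for this data.', Nw, Ntrneq)
end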
For a single hidden layer
weights = getwb(net)
= [ Iw(:); b1(:); Lw(:); b2(:) ]
where
Iw = cell2mat(net.IW)
b1 = cell2mat(net.b(1))
Lw = cell2mat(net.LW)
b2 = cell2mat(net.b(2))
You can try an example if you want to see how getwb orders weights with 2 hidden layers.
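For example, here is a minimal sketch (the toy data, the layer sizes [5 3], and the variable names are placeholders I chose, not part of the original answer) that builds a two-hidden-layer net and lets you compare the per-layer matrices against the getwb vector:
x = rand(3, 100);                   % 3 inputs, 100 toy cases
t = rand(1, 100);                   % 1 output
net = feedforwardnet([5 3]);        % two hidden layers: 5 and 3 nodes
net.trainParam.showWindow = false;  % suppress the training GUI
net = train(net, x, t);
IW  = net.IW{1,1}                   % input -> hidden-1 weights, 5x3
LW1 = net.LW{2,1}                   % hidden-1 -> hidden-2 weights, 3x5
LW2 = net.LW{3,2}                   % hidden-2 -> output weights, 1x3
b1  = net.b{1}                      % hidden-1 biases, 5x1
b2  = net.b{2}                      % hidden-2 biases, 3x1
b3  = net.b{3}                      % output bias, 1x1
wb = getwb(net);                    % every weight and bias in one vector
numel(wb)                           % equals the total element count above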
Hope this helps.
Thank you for formally accepting my answer
Greg
  2 comments
Sai Kumar Dwivedi on 18 Mar 2015
The explanation you gave in your 2nd point,
(I+1)*H+(H+1)*O < Ntrn*O
Is this some kind of heuristic, or does it have a mathematical background?
Aswany on 6 Dec 2023
What about a neural network with 7 hidden layers?


More Answers (2)

Manu R on 6 Mar 2011
Edited: John Kelly on 19 Nov 2013
Neural net objects in MATLAB have fields you can access to determine layer weights and biases.
Suppose:
mynet = feedforwardnet % Just a toy example, without any training
weights = mynet.LW
biases = mynet.b
% weight and bias values:
%
% IW: {2x1 cell} containing 1 input weight matrix
% LW: {2x2 cell} containing 1 layer weight matrix
% b: {2x1 cell} containing 2 bias vectors
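Note that for an untrained feedforwardnet the cells above are still empty. A quick sketch (toy data and variable names of my own choosing) of how the numeric values appear once the network has been trained:
x = rand(3, 50);             % toy inputs: 3 variables, 50 cases
t = rand(1, 50);             % toy targets
mynet = feedforwardnet(10);  % one hidden layer of 10 nodes
mynet = train(mynet, x, t);
mynet.IW{1,1}                % 10x3 matrix of input weight values
mynet.LW{2,1}                % 1x10 matrix of layer weight values
mynet.b{1}                   % 10x1 hidden bias vector
mynet.b{2}                   % 1x1 output bias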
  2 comments
Kai Hao Then on 27 Mar 2011
There should be numbers showing the weights and biases. The link you showed me only provides information about the weights and biases, not a way to display their values. Is there any other way?
Rahul on 18 Apr 2013
Display "weights" in this example and it will show numbers.



Rahul on 18 Apr 2013
Or open "weights" in the Workspace.
