What exactly does the iteration count mean when training neural networks?

I am using newff for training neural networks. My input data is an array of size 6×2000, meaning 2000 samples of 6-parameter inputs. My output is of size 81×2000, meaning 2000 samples of 81-parameter outputs. When I start training, the Neural Network Toolbox automatically sets the iteration count limit to 1000. Does the 1000 iteration count mean it is training the network with the same data 1000 times?
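For reference, a minimal sketch of the setup described above; the variable names, hidden layer size, and random placeholder data are assumptions, not taken from the question. The 1000-iteration limit shown in the training window corresponds to the default value of net.trainParam.epochs.

X = rand(6, 2000);               % placeholder: 2000 samples of 6 input parameters
T = rand(81, 2000);              % placeholder: 2000 samples of 81 output parameters
net = newff(X, T, 20);           % feedforward net with one hidden layer of 20 neurons (assumed size)
net.trainParam.epochs = 1000;    % default epoch limit shown as the iteration count limit
[net, tr] = train(net, X, T);    % each epoch is one pass over the full data set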

Answers (1)

Srivardhan Gadila
Srivardhan Gadila on 13 Feb 2020
Edited: Srivardhan Gadila on 13 Feb 2020
The number of iterations is determined by the MiniBatchSize and the number of epochs specified in trainingOptions, together with the number of training samples.
An iteration is one step of the gradient descent algorithm towards minimizing the loss function, taken using a single mini-batch. An epoch is one full pass of the training algorithm over the entire training set.
Iterations per epoch = Number of training samples ÷ MiniBatchSize, i.e., the number of forward and backward passes performed within a single epoch.
Total iterations = Iterations per epoch × Number of epochs
If the number of epochs is n, then the network is trained on the same data n times, as in the small example below.
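A short worked example of these formulas; the MiniBatchSize and the number of epochs here are hypothetical values chosen for illustration, not taken from the question.

numTrainingSamples = 2000;
miniBatchSize = 100;                                              % hypothetical MiniBatchSize
numEpochs = 30;                                                   % hypothetical number of epochs
iterationsPerEpoch = floor(numTrainingSamples / miniBatchSize);   % 2000 / 100 = 20
totalIterations = iterationsPerEpoch * numEpochs                  % 20 * 30 = 600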
