MATLAB Dropout layer during prediction

14 views (last 30 days)
rakbar on 7 Jan 2019
Commented: Miles Brim on 11 May 2022
The Documentation for a Dropout layer states that:
"At prediction time the output of a dropout layer is equal to its input."
I assume this means that during prediction, there is no dropout.
Is there a method in MATLAB to enable Dropout during prediction time?
1 comment
Michael Phillips on 12 Mar 2021
Has there been any progress on this in recent MATLAB updates? I would also like to make an uncertainty assessment of my network using the Monte Carlo dropout method proposed by Gal and Ghahramani in 2016.


Answers (5)

Michael Hesse on 18 Nov 2020
Any updates on this topic? I'm also interested in using dropout at prediction time to obtain an estimate of the network's uncertainty. Yarin Gal showed that this procedure is equivalent to approximate Bayesian inference.

Michael Hesse on 18 Nov 2020
classdef Dropout < nnet.internal.cnn.layer.FunctionalLayer ...
        & nnet.internal.cnn.layer.CPUFusableLayer
    % Dropout   Implementation of the dropout layer

    % Copyright 2015-2019 The MathWorks, Inc.

    properties
        % LearnableParameters   Learnable parameters for the layer
        %   This layer has no learnable parameters.
        LearnableParameters = nnet.internal.cnn.layer.learnable.PredictionLearnableParameter.empty();

        % Name (char array)   A name for the layer
        Name
    end

    properties (Constant)
        % DefaultName   Default layer name.
        DefaultName = 'dropout'
    end

    properties
        % Learnables   Empty
        Learnables
    end

    properties (SetAccess = protected, GetAccess = ?nnet.internal.cnn.dlnetwork)
        % LearnablesNames   Empty
        LearnablesNames
    end

    properties (SetAccess = private)
        % InputNames   This layer has a single input
        InputNames = {'in'}

        % OutputNames   This layer has a single output
        OutputNames = {'out'}

        % HasSizeDetermined   Specifies if all size parameters are set
        %   For this layer, there are no size parameters to set.
        HasSizeDetermined = true

        % Probability   The proportion of neurons to drop
        %   A number between 0 and 1 which specifies the proportion of
        %   input elements that are dropped by the dropout layer.
        Probability
    end

    methods
        function this = Dropout(name, probability)
            this.Name = name;
            this.Probability = probability;

            % Dropout layer doesn't need X or Z for the backward pass
            this.NeedsXForBackward = false;
            this.NeedsZForBackward = false;
        end

        function Z = predict(~, X)
            Z = X;
        end

        function [Z, dropoutMask] = forward(this, X)
            % Use "inverted dropout", where we use scaling at training time
            % so that we don't have to scale at test time. The scaled
            % dropout mask is returned as the variable "dropoutMask".
            if ~isa(X, 'dlarray')
                superfloatOfX = superiorfloat(X);
            else
                superfloatOfX = superiorfloat(extractdata(X));
            end
            dropoutScaleFactor = cast(1 - this.Probability, superfloatOfX);
            dropoutMask = (rand(size(X), 'like', X) > this.Probability) / dropoutScaleFactor;
            Z = X .* dropoutMask;
        end

        function [dX, dW] = backward(~, ~, ~, dZ, mask)
            dX = dZ .* mask;
            dW = []; % No learnable parameters
        end

        function outputSize = forwardPropagateSize(~, inputSize)
            outputSize = inputSize;
        end

        function this = inferSize(this, ~)
        end

        function tf = isValidInputSize(~, ~)
            % isValidInputSize   Check if the layer can accept an input of
            % a certain size
            tf = true;
        end

        function outputSeqLen = forwardPropagateSequenceLength(~, inputSeqLen, ~)
            % forwardPropagateSequenceLength   The sequence length of the
            % output of the layer given an input sequence length

            % Propagate arbitrary sequence length
            outputSeqLen = inputSeqLen;
        end

        function this = initializeLearnableParameters(this, ~)
        end

        function this = prepareForTraining(this)
            this.LearnableParameters = nnet.internal.cnn.layer.learnable.TrainingLearnableParameter.empty();
        end

        function this = prepareForPrediction(this)
            this.LearnableParameters = nnet.internal.cnn.layer.learnable.PredictionLearnableParameter.empty();
        end

        function this = setupForHostPrediction(this)
        end

        function this = setupForGPUPrediction(this)
        end

        function this = setupForHostTraining(this)
        end

        function this = setupForGPUTraining(this)
        end
    end

    methods (Access = protected)
        function this = setFunctionalStrategy(this)
            % No-op
        end
    end

    methods (Hidden)
        function layerArgs = getFusedArguments(~)
            % getFusedArguments   Returns the arguments needed to call the
            % layer in a fused network.
            layerArgs = { 'passthrough' };
        end

        function tf = isFusable(~, ~, ~)
            % isFusable   Indicates if the layer is fusable in a given network.
            tf = true;
        end
    end
end
How can I copy the forward method into the predict method?
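A minimal sketch of what that change could look like, assuming you are editing a private copy of the class above (modifying internal MathWorks classes is unsupported and may break between releases): replace the pass-through predict with the same inverted-dropout logic used in forward.

function Z = predict(this, X)
    % Same inverted-dropout computation as forward, so a fresh random
    % mask is also drawn at prediction time
    if ~isa(X, 'dlarray')
        superfloatOfX = superiorfloat(X);
    else
        superfloatOfX = superiorfloat(extractdata(X));
    end
    dropoutScaleFactor = cast(1 - this.Probability, superfloatOfX);
    dropoutMask = (rand(size(X), 'like', X) > this.Probability) / dropoutScaleFactor;
    Z = X .* dropoutMask;
end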
2 comments
Michael Phillips on 12 Mar 2021
Hey Michael, I'm trying to solve the same problem - that is, I want to use dropout during testing to estimate my network uncertainty. Did you ever get a version of this code working? If so, would you be willing to share it with me? Thanks!
Miles Brim on 11 May 2022
I am also interested in this



Vishal Bhutani on 10 Jan 2019
Based on my understanding, a dropout layer is used to avoid over-fitting of the neural network. The term "dropout" refers to dropping out units (both hidden and visible) in a neural network. This functionality is only needed while training the network. At test time the whole network is used, i.e., all weights are accounted for, so during testing or prediction the output of a dropout layer is equal to its input.
It would be better if you described your use case; that might help in understanding the issue in more detail.
Hope it helps.
1 comment
rakbar on 10 Jan 2019
Some recent studies have shown that when the dropout layer is also active at prediction time, the prediction interval (or confidence interval) of the target can also be estimated. See for example:
"A Theoretically Grounded Application of Dropout in Recurrent Neural Networks" https://arxiv.org/abs/1512.05287
At prediction time, the idea is to perform a number of Monte-Carlo-style forward passes for each target to obtain a distribution of predictions, then report the mean and standard deviation as the final target prediction and its error. This can be done in Python/Keras/TensorFlow.
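For illustration, a minimal MATLAB sketch of that Monte-Carlo loop, assuming net is a trained network whose dropout stays active at prediction time (for example via a custom layer, as discussed elsewhere in this thread) and XTest is a single input producing a scalar prediction:

numMC = 100;                         % number of stochastic forward passes
preds = zeros(numMC, 1);
for k = 1:numMC
    preds(k) = predict(net, XTest);  % each pass draws a fresh dropout mask
end
yHat = mean(preds);                  % final point prediction
yErr = std(preds);                   % spread serves as the uncertainty estimate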



Greg Heath on 11 Jan 2019
help dropout
... It is important to note that when creating a network, dropout will only be used during training.
Hope this helps.
Thank you for formally accepting my answer
Greg

Don Mathis on 17 Jan 2019
You could write yourself a custom dropout layer that does dropout in both the forward() and predict() methods. For dropout rate p, it would set each activation to 0 with probability p and then multiply all activations by 1/(1-p).
I'm not sure, but you might be able to give it a writable 'p' property so you could set it to 0 after training if you want.
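A minimal sketch of such a layer against the documented custom-layer API (nnet.layer.Layer); the class name mcDropout is illustrative, not shipping code. Because a custom layer that defines no separate forward method also uses predict during training, dropout stays active in both phases; and since Probability is an ordinary writable property, it can in principle be set to 0 to recover pass-through behaviour, though, as noted above, changing it on an already-trained network is untested.

classdef mcDropout < nnet.layer.Layer
    properties
        % Probability   Dropout rate p; an ordinary (writable) property
        Probability
    end
    methods
        function layer = mcDropout(name, probability)
            layer.Name = name;
            layer.Probability = probability;
            layer.Description = "Dropout active at prediction time, p = " + probability;
        end
        function Z = predict(layer, X)
            % Inverted dropout: zero each element with probability p, then
            % scale the survivors by 1/(1-p) so the expected activation is
            % unchanged
            if layer.Probability == 0
                Z = X;   % pass-through when dropout is disabled
                return
            end
            mask = (rand(size(X), 'like', X) > layer.Probability) ...
                / (1 - layer.Probability);
            Z = X .* mask;
        end
    end
end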

