How do I customize a self-attention layer for identifying wafer defects?

13 views (last 30 days)
How can I use customized multi-head self-attention in a CNN for detecting wafer defects? Please explain with an example.

Accepted Answer

Shantanu Dixit
Shantanu Dixit on 15 Jul 2024
Hi Sharith,
It is my understanding that you want to add and customize self-attention in a CNN for detecting wafer defects.
You can define a CNN-based architecture and add a self-attention layer near the end using ‘selfAttentionLayer’. The function takes two arguments, which set the ‘NumHeads’ and ‘NumKeyChannels’ properties, i.e., the number of attention heads and the dimension of the key vectors.
Below is a reference code for the model architecture:
layers = [
    imageInputLayer([28 28 1], 'Name', 'input')           % 28x28 grayscale wafer maps
    convolution2dLayer(3, 16, 'Padding', 'same', 'Name', 'conv1')
    batchNormalizationLayer('Name', 'bn1')
    reluLayer('Name', 'relu1')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'maxpool1')
    convolution2dLayer(3, 32, 'Padding', 'same', 'Name', 'conv2')
    batchNormalizationLayer('Name', 'bn2')
    reluLayer('Name', 'relu2')
    flattenLayer('Name', 'flatten')
    selfAttentionLayer(4, 32, 'Name', 'self_attention')   % 4 heads, 32 key channels
    fullyConnectedLayer(10, 'Name', 'fc')                 % 10 defect classes
    softmaxLayer('Name', 'softmax')
    classificationLayer('Name', 'output')
];
The above code defines a CNN-based architecture incorporating multi-head self-attention (MHSA) for ten-class classification.
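Before training, you can sanity-check the architecture with analyzeNetwork, which reports each layer's output size and flags any layer incompatibilities:
% Inspect layer connectivity and output sizes before training
analyzeNetwork(layers);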
Refer to the MathWorks documentation for ‘selfAttentionLayer’ for more information.
1 comment
Sharith Dhar
Sharith Dhar on 15 Jul 2024
Thanks for the response, but I want to modify the self-attention layer properties QueryWeights, KeyWeights, ValueWeights, and OutputWeights. In that case, what is the MATLAB code?
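One possible approach, as a minimal sketch: these learnable properties default to empty and are initialized during training, but, assuming they are assignable like the Weights property of other built-in layers, you can set them on the layer object before assembling the network. The matrix sizes follow the property descriptions in the selfAttentionLayer documentation; numInputChannels and the 0.01*randn initialization below are illustrative assumptions, not values from the thread.
numHeads = 4;
numKeyChannels = 32;
numInputChannels = 32;  % channels entering the attention layer (assumed)

attn = selfAttentionLayer(numHeads, numKeyChannels, 'Name', 'self_attention');

% QueryWeights and KeyWeights are NumKeyChannels-by-numInputChannels.
attn.QueryWeights  = 0.01*randn(numKeyChannels, numInputChannels);
attn.KeyWeights    = 0.01*randn(numKeyChannels, numInputChannels);

% ValueWeights is NumValueChannels-by-numInputChannels
% (NumValueChannels defaults to NumKeyChannels).
attn.ValueWeights  = 0.01*randn(numKeyChannels, numInputChannels);

% OutputWeights is OutputSize-by-NumValueChannels
% (OutputSize defaults to the number of input channels).
attn.OutputWeights = 0.01*randn(numInputChannels, numKeyChannels);
You can then use attn in place of the selfAttentionLayer call in the layer array above. For full control over the attention computation itself, rather than only the initial weights, the general mechanism is to define a custom layer by subclassing nnet.layer.Layer.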

More Answers (0)
