Example of using the self-attention layer in MATLAB R2023a

272 views (last 30 days)
MAHMOUD EID on 21 Mar 2023
Commented: DGM on 5 Mar 2024
In MATLAB R2023a, the self-attention layer was introduced.
Can an example be provided of how to use it in image classification tasks?
  2 Comments
Kuo on 7 Jul 2023
Same question here. Could there be an example for time series forecasting? Thanks!

Accepted Answer

Himanshu on 29 Mar 2023
Hi Mahmoud,
I understand that you want to use "selfAttentionLayer" for an image classification task in MATLAB.
A self-attention layer computes single-head or multihead self-attention of its input. The following example uses the "DigitDataset" that ships with MATLAB.
% load the digit data set and split it 70/30 into training and validation sets
digitDatasetPath = fullfile(matlabroot, 'toolbox', 'nnet', 'nndemos', 'nndatasets', 'DigitDataset');
imds = imageDatastore(digitDatasetPath, ...
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');
[imdsTrain, imdsValidation] = splitEachLabel(imds, 0.7, 'randomized');
% define the network architecture
layers = [
    imageInputLayer([28 28 1], 'Name', 'input')
    convolution2dLayer(3, 32, 'Padding', 'same', 'Name', 'conv1')
    batchNormalizationLayer('Name', 'bn1')
    reluLayer('Name', 'relu1')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'maxpool1')
    convolution2dLayer(3, 64, 'Padding', 'same', 'Name', 'conv2')
    batchNormalizationLayer('Name', 'bn2')
    reluLayer('Name', 'relu2')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'maxpool2')
    flattenLayer('Name', 'flatten')
    selfAttentionLayer(8, 64, 'Name', 'self_attention') % 8 heads, 64 key-query channels
    fullyConnectedLayer(10, 'Name', 'fc')
    softmaxLayer('Name', 'softmax')
    classificationLayer('Name', 'output')];
% set the training options
options = trainingOptions('sgdm', ...
    'InitialLearnRate', 0.01, ...
    'MaxEpochs', 5, ...
    'Shuffle', 'every-epoch', ...
    'ValidationData', imdsValidation, ...
    'ValidationFrequency', 30, ...
    'Verbose', false, ...
    'Plots', 'training-progress');
% train the network
net = trainNetwork(imdsTrain, layers, options);
Training output: (training-progress plot not reproduced here)
In this code, the selfAttentionLayer is used to process 28x28 grayscale images. The self-attention mechanism helps the model capture long-range dependencies in the input data, meaning it can learn to relate different parts of the image to each other. By placing the selfAttentionLayer after a series of convolutional and pooling layers, the model can enhance its feature representation by considering spatial relationships between different regions of the input image.
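Once training completes, a quick sanity check is to classify the held-out validation images and compute the accuracy. Here is a minimal sketch that reuses the net and imdsValidation variables from the code above:
% classify the validation images with the trained network
YPred = classify(net, imdsValidation);
YTrue = imdsValidation.Labels;
% fraction of validation images classified correctly
accuracy = mean(YPred == YTrue)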
You can refer to the documentation on creating and training a simple convolutional neural network for deep learning classification to learn more.
  5 Comments
cui,xingxing on 5 Jan 2024
@Muhammad Shoaib, @Himanshu I have tried to use selfAttentionLayer with time-sequence data in R2023b, but it failed! Please see the following link. Any ideas?
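For sequence data, one common pattern is to give the network a sequence input and to reduce the time dimension before the classification head, since selfAttentionLayer attends over the time steps of its input. A minimal sequence-to-label sketch along these lines follows; the channel count, class count, and layer sizes are placeholder values, not taken from this thread:
% placeholder sizes -- replace with values for your own data set
numChannels = 12; % features per time step (assumed)
numClasses = 4;   % number of sequence labels (assumed)
layers = [
    sequenceInputLayer(numChannels, 'Name', 'input')
    selfAttentionLayer(4, 64, 'Name', 'self_attention') % 4 heads, 64 key-query channels
    globalAveragePooling1dLayer('Name', 'gap')          % pool over time: one vector per sequence
    fullyConnectedLayer(numClasses, 'Name', 'fc')
    softmaxLayer('Name', 'softmax')
    classificationLayer('Name', 'output')];
For forecasting rather than classification, the same idea applies with fullyConnectedLayer(numResponses) and regressionLayer in place of the classification head.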
DGM on 5 Mar 2024
Posted as a comment-as-flag by chang gao:
Useful answer.

More Answers (0)
