How can I make a deep learning custom layer that takes multiple inputs, like addition/concatenation?
2 views (last 30 days)
I am implementing an attention mechanism for image segmentation. One step in attention is to take the dot product of the attention coefficients and the input feature map; however, the MATLAB Neural Network Toolbox doesn't support such an operation yet. What I tried was to modify the "Addition" layer and the relevant functions into a new class/functions that handle the dot product of two inputs, and I also added those functions to the path. It kind of works, in that it does generate an "attention layer" that can take two inputs. However, when I train the network, I get an error:
Error using nnet.internal.cnn.analyzer.NetworkAnalyzer/propagateSizes (line 223): Index exceeds array bounds.
2 comments
Markus Wagner
on 2 Jan 2019
I have the same problem. I want to implement a custom hidden layer and a custom regression layer with 2 inputs, like the addition/concatenation layer, to build up a VAE network.
After adapting the class of the Addition layer and using it in a layerGraph, it failed with an error:
Error using nnet.internal.cnn.util.validateLayersForLayerGraph (line 20)
Expected input to be one of these types:
nnet.cnn.layer.Layer
Instead its type was latentLayer.
Hanxx Sakura
on 17 Jan 2019
Edited: Hanxx Sakura on 17 Jan 2019
I think there is no official way to implement a module with multiple inputs, but I finally accomplished it, inspired by Addition.m and AdditionLayer.m, as follows:
- implement an internal layer (e.g., myAdd) like the "Addition" class: define the variables and the forward/backward functions;
- since the internal layer cannot be used in a layerGraph directly, wrap it in an external class (e.g., myAddLayer), as in "AdditionLayer.m";
- create an object of your module by calling myAddLayer();
Hope this helps~
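For reference, newer Deep Learning Toolbox releases make the internal-class wrapping above unnecessary: a documented custom layer can declare multiple inputs via the NumInputs property, and the backward pass can be left to automatic differentiation. A minimal sketch of the element-wise product layer the original question asks for (class and layer names are my own, and this assumes a release with multi-input custom layer support):

```matlab
% Hypothetical sketch: a custom two-input layer that multiplies the
% attention coefficients by the input feature map element-wise.
% Uses only the documented nnet.layer.Layer API; no internal classes.
classdef elementwiseProductLayer < nnet.layer.Layer
    methods
        function layer = elementwiseProductLayer(name)
            layer.Name = name;
            layer.NumInputs = 2;   % declare two inputs
            layer.Description = "Element-wise product of two inputs";
        end

        function Z = predict(~, X1, X2)
            % Forward pass: element-wise product of the two inputs.
            % With dlarray inputs, the backward pass is derived
            % automatically, so no backward method is needed.
            Z = X1 .* X2;
        end
    end
end
```

The layer can then be placed in a layerGraph like any built-in two-input layer and wired up with connectLayers.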
Answers (1)
Maksym Tymchenko
on 10 Mar 2023
Since release R2022b, MATLAB directly supports the attention mechanism that you are trying to implement.
Check the documentation of the attention function at the link below:
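As a rough illustration of that built-in function, the sketch below calls attention on formatted dlarray inputs. The sizes and format are illustrative assumptions on my part; check the documentation for the exact options your release supports.

```matlab
% Hedged sketch of the built-in attention function (Deep Learning
% Toolbox, R2022b or later). All sizes here are made up for the example.
numChannels = 10;          % must be divisible by numHeads
numObservations = 32;
sequenceLength = 5;
numHeads = 2;

% Queries, keys, and values as formatted dlarrays:
% "CBT" = channel x batch x time.
Q = dlarray(rand(numChannels, numObservations, sequenceLength), "CBT");
K = dlarray(rand(numChannels, numObservations, sequenceLength), "CBT");
V = dlarray(rand(numChannels, numObservations, sequenceLength), "CBT");

% Scaled dot-product attention with two heads.
Y = attention(Q, K, V, numHeads);
```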
0 comments