Are there any options to resize/replicate the matrices/vectors between layers of a deep network?
In a deep learning network, I have two branches operating on the same input layer. In one branch I have a fully connected layer whose output is 1x1xN. In the other branch I have a convolutional layer that produces a PxQxS feature map. To proceed with further convolutions on the combined result, I need to concatenate the outputs of these branches by repeating the N-dimensional vector to form a PxQxN array, giving a PxQx(N+S) array overall. To do this, is there a way to replicate a vector into a matrix between deep network layers, analogous to the 'repmat()' function?
In other words, is there a way to concatenate two layers of different width and height by bringing them to a common size inside a deep network?
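Outside a network, the desired operation is simply repmat followed by cat along the channel dimension; the sizes below are arbitrary examples (P = 7, Q = 7, N = 16, S = 32):

% Arbitrary example sizes: P = 7, Q = 7, N = 16, S = 32
v = rand(1, 1, 16);             % 1x1xN output of the fully connected branch
f = rand(7, 7, 32);             % PxQxS output of the convolutional branch
vTiled   = repmat(v, [7 7 1]);  % PxQxN: replicate the vector spatially
combined = cat(3, f, vTiled);   % PxQx(N+S) = 7x7x48, ready for further convolutions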
Answers (1)
Delprat Sebastien
on 25 Jan 2020
I wrote a custom reshape layer for that purpose. Read the custom layer documentation; it is very simple. There is, however, one big limitation: custom layers cannot change the dlarray format. That means you need a convolution layer between the fully connected layer (whose output format is SB) and your reshape layer. The convolution layer outputs an SSCB array, which you can then reshape.
Source: MathWorks Support
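As a minimal sketch of that idea (not the exact layer from the answer), the hypothetical class below tiles a 1x1xN activation to a fixed PxQxN size in its predict method. repmat is on the list of dlarray-supported functions, so the same code should work whether X arrives as a numeric array or a dlarray; the class name, property names, and the fixed output size are assumptions for illustration.

classdef replicateLayer < nnet.layer.Layer
    % Hypothetical custom layer: tiles a 1-by-1-by-N activation to
    % P-by-Q-by-N so it can be depth-concatenated with a P-by-Q-by-S
    % feature map. The target spatial size is fixed at construction.

    properties
        OutputHeight   % P
        OutputWidth    % Q
    end

    methods
        function layer = replicateLayer(h, w, name)
            layer.Name = name;
            layer.OutputHeight = h;
            layer.OutputWidth = w;
            layer.Description = "Replicate 1x1xN input to " + h + "x" + w + "xN";
        end

        function Z = predict(layer, X)
            % X is expected as H-by-W-by-C(-by-B) with H = W = 1;
            % repmat tiles only the two spatial dimensions.
            Z = repmat(X, [layer.OutputHeight layer.OutputWidth 1 1]);
        end
    end
end

With this saved as replicateLayer.m, the fully connected branch could pass through replicateLayer(P, Q, 'rep') and then meet the convolutional branch in a depthConcatenationLayer(2, 'Name', 'concat'), subject to the SB/SSCB caveat described above.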