Complete transformer model (Encoder + Decoder + Interconnections)

26 views (last 30 days)
WIll Serrano on 2 Aug 2024
Commented: WIll Serrano on 5 Oct 2024
Hello
I am wondering if there is already a MATLAB keyboard warrior who has coded (in MATLAB) a full transformer model:
  1. Inputs: Input Embedding + Positional Encoding
  2. Encoder: Multihead Attention + Add & Normalisation + Feedforward + Add & Normalisation
  3. Outputs: Output Embedding + Positional Encoding
  4. Decoder: Masked Multihead Attention + Add & Normalisation + Multihead Attention + Add & Normalisation + Feedforward + Add & Normalisation
  5. Final: Linear and Softmax.
Including all the interconnections between them; a rough sketch of the core pieces follows below.
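To be concrete about the terminology, here is a minimal sketch in plain MATLAB (no toolboxes) of two of those pieces: the sinusoidal positional encoding from items 1 and 3, and a single-head scaled dot-product attention covering the plain, masked, and cross-attention uses in items 2 and 4. The function names are illustrative only; a real model would use several heads and learned embeddings.

% Sinusoidal positional encoding (Vaswani et al., 2017).
% Returns a (seqLen x dModel) matrix to add to the token embeddings.
% dModel is assumed even.
function PE = positionalEncoding(seqLen, dModel)
    pos    = (0:seqLen-1)';                    % token positions
    dims   = 0:2:dModel-1;                     % even dimension indices
    angles = pos ./ 10000.^(dims/dModel);      % (seqLen x dModel/2)
    PE = zeros(seqLen, dModel);
    PE(:, 1:2:end) = sin(angles);              % even dimensions: sine
    PE(:, 2:2:end) = cos(angles);              % odd dimensions: cosine
end

% Single-head scaled dot-product attention.
% Xq: (Tq x dModel) drives the queries; Xkv: (Tkv x dModel) drives keys/values.
% Wq, Wk, Wv: (dModel x dModel), so residual connections match later.
% Self-attention:                  attentionBlock(X, X, Wq, Wk, Wv)
% Masked self-attention:           attentionBlock(X, X, Wq, Wk, Wv, triu(-Inf(T), 1))
% Encoder-decoder cross-attention: attentionBlock(Xdec, Xenc, Wq, Wk, Wv)
function Y = attentionBlock(Xq, Xkv, Wq, Wk, Wv, mask)
    Q = Xq * Wq;  K = Xkv * Wk;  V = Xkv * Wv;
    scores = (Q * K') / sqrt(size(K, 2));      % scaled dot products
    if nargin > 5
        scores = scores + mask;                % -Inf blocks attention to the future
    end
    A = exp(scores - max(scores, [], 2));      % row-wise softmax,
    A = A ./ sum(A, 2);                        %   numerically stabilised
    Y = A * V;                                 % attention-weighted sum of values
end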
Thank you
Will

Answers (1)

Yash Sharma on 5 Aug 2024
Hi Will,
You can take a look at the following File Exchange submission.
  2 comments
WIll Serrano on 7 Aug 2024
Hello Yash
Thank you for your answer.
I have read that one; it is based on a pre-trained transformer and does not directly represent the transformer components. It also provides the same functionality as a standard LSTM for text classification.
It is generally acknowledged that attention-based transformers are in some respects superior to LSTM-based deep learning; however, I have yet to prove it myself.
Thank you
Will
WIll Serrano on 5 Oct 2024
Since it seems nobody has answered, I have cracked the code myself.
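For anyone landing here later: one way such a from-scratch version can be wired together, shown here only as a minimal single-head sketch rather than the actual code referred to above, is an encoder block plus the final linear-and-softmax head. It reuses the illustrative attentionBlock function from the sketch in the question, assumed to be on the path.

% One encoder block: self-attention -> Add & Norm -> feedforward -> Add & Norm.
% X: (seqLen x dModel); p: struct with fields Wq, Wk, Wv (dModel x dModel),
% W1 (dModel x dFF), b1 (1 x dFF), W2 (dFF x dModel), b2 (1 x dModel).
function Y = encoderBlock(X, p)
    A = attentionBlock(X, X, p.Wq, p.Wk, p.Wv);   % self-attention
    X = layerNorm(X + A);                         % Add & Normalisation
    F = max(0, X*p.W1 + p.b1) * p.W2 + p.b2;      % position-wise ReLU feedforward
    Y = layerNorm(X + F);                         % Add & Normalisation
end

% Layer normalisation across the model dimension (per token position).
function Y = layerNorm(X)
    Y = (X - mean(X, 2)) ./ (std(X, 0, 2) + 1e-6);
end

% Final stage: linear projection to vocabulary logits, then softmax.
% Y: (seqLen x dModel); Wout: (dModel x vocabSize); bout: (1 x vocabSize).
function P = outputHead(Y, Wout, bout)
    logits = Y * Wout + bout;
    P = exp(logits - max(logits, [], 2));         % row-wise softmax over the
    P = P ./ sum(P, 2);                           %   vocabulary, per position
end

A decoder block follows the same Add & Norm pattern, with a masked self-attention first and then a cross-attention call attentionBlock(Xdec, Xenc, ...) against the encoder output, which is the main interconnection between the two halves.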
