Complete transformer model (Encoder + Decoder + Interconnections)

Hello
I am wondering if there is already a MATLAB keyboard warrior out there who has coded (in MATLAB) a full transformer model:
  1. Inputs: Input Embedding + Positional Encoding
  2. Encoder: Multihead Attention + Add & Normalisation + Feedforward + Add & Normalisation
  3. Outputs: Output Embedding + Positional Encoding
  4. Decoder: Masked Multihead Attention + Add & Normalisation + Multihead Attention + Add & Normalisation + Feedforward + Add & Normalisation
  5. Final: Linear and Softmax.
Including all the interconnections between them.
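For reference, the core pieces that recur in the list above, scaled dot-product attention (the heart of each multi-head block, including the masked decoder variant) and the Add & Normalisation step, can be sketched in plain Python. This is a minimal illustration of the standard formulation, not code from any particular toolbox; all function names here are made up for the example:

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def attention(Q, K, V, mask=None):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.
    A falsy mask[i][j] blocks position i from attending to position j;
    the decoder's "Masked Multihead Attention" uses a causal mask."""
    d_k = len(K[0])
    scores = matmul(Q, [list(col) for col in zip(*K)])  # Q K^T
    scores = [[s / math.sqrt(d_k) for s in row] for row in scores]
    if mask is not None:
        scores = [[s if keep else float('-inf')
                   for s, keep in zip(row, mrow)]
                  for row, mrow in zip(scores, mask)]
    weights = [softmax(row) for row in scores]
    return matmul(weights, V)

def add_and_norm(x, sublayer_out, eps=1e-5):
    """The "Add & Normalisation" step: residual connection followed by
    layer normalisation over one position's feature vector."""
    summed = [xi + yi for xi, yi in zip(x, sublayer_out)]
    mean = sum(summed) / len(summed)
    var = sum((v - mean) ** 2 for v in summed) / len(summed)
    return [(v - mean) / math.sqrt(var + eps) for v in summed]
```

An encoder layer then wires these as `add_and_norm(x[i], attn_out[i])` per position, followed by the feedforward sublayer with its own Add & Norm; the decoder adds the masked self-attention and the encoder-decoder attention in front.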
Thank you
Will

Answers (1)

Yash Sharma
Yash Sharma on 5 Aug 2024

0 votes

Hi Will,
You can take a look at the following File Exchange submission.

2 comments

Hello Yash
Thank you for your answer.
I read that one; it is based on a pre-trained transformer and does not directly expose the transformer components. It also provides the same functionality as a plain LSTM for text classification.
It is widely acknowledged that transformers with attention are in some respects superior to LSTM-based deep learning; however, I have yet to verify that myself.
Thank you
Will
As it seems nobody has answered, I have cracked the code myself.
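For anyone following along, one of the remaining building blocks from the list in the question, the sinusoidal positional encoding, can be sketched in plain Python. This is a generic illustration of the standard formula, not Will's actual solution:

```python
import math

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding:
    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    Added element-wise to the input/output embeddings."""
    pe = [[0.0] * d_model for _ in range(seq_len)]
    for pos in range(seq_len):
        for i in range(0, d_model, 2):
            angle = pos / (10000 ** (i / d_model))
            pe[pos][i] = math.sin(angle)
            if i + 1 < d_model:
                pe[pos][i + 1] = math.cos(angle)
    return pe
```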



Asked: 2 Aug 2024

Commented: 5 Oct 2024
