How to obtain a word embedding vector for each word in a sentence using pre-trained BERT in MATLAB

Hello,
I have a question on how to obtain a word embedding vector for each word in a sentence using pre-trained BERT in MATLAB. I successfully loaded BERT and tokenized the words in the sentence, but I couldn't find example code on the MathWorks website for getting each word's embedding vector, as with word2vec.
[net,tokenizer] = bert;
str = "Bidirectional Encoder Representations from Transformers";
words = wordTokenize(tokenizer,str)
% Then what...?
I would appreciate it if anyone could help with this.

Answers (1)

Ganesh on 31 Dec 2023
I understand that you want to generate word embeddings with the BERT model in MATLAB. You can start with the "encode()" function, used in the same way as your own code:
[net,tokenizer] = bert;
str = "Bidirectional Encoder Representations from Transformers";
tokenCodes = encode(tokenizer,str);   % numeric token codes, one per subword token
Note that "encode()" produces numeric token codes rather than dense embedding vectors; you can use the "decode()" function to map the codes back to text.
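To get a contextual embedding vector for each token (the BERT analogue of a word2vec lookup), you can pass the encoded tokens through the returned network with "predict". Below is a minimal sketch; the dlarray formats and the order of the inputs (token codes, attention mask, segment IDs) are assumptions based on the typical BERT workflow, so check net.InputNames and the "bert" documentation example for the exact interface in your release.
[net,tokenizer] = bert;
str = "Bidirectional Encoder Representations from Transformers";
% Encode the sentence into numeric token codes and segment IDs (cell arrays).
[tokenCodes,segments] = encode(tokenizer,str);
% Format the inputs as dlarray objects ("CTB" format assumed: channel, time, batch).
inputIDs   = dlarray(tokenCodes{1},"CTB");
segmentIDs = dlarray(segments{1},"CTB");
mask       = dlarray(ones(size(tokenCodes{1})),"CTB");   % no padding needed for a single sentence
% Run the tokens through BERT. The input order below is an assumption;
% check net.InputNames for the exact names and order in your release.
embeddings = predict(net,inputIDs,mask,segmentIDs);
size(embeddings)   % one embedding vector per token, e.g. 768-by-numTokens for BERT-base
% Map the token codes back to text to check what was encoded.
decode(tokenizer,tokenCodes)
Also note that the tokenizer works at the WordPiece (subword) level, so a single word may produce several token codes and therefore several embedding vectors; to get one vector per word you would need to pool (for example, average) the vectors of its subword tokens.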
Kindly refer to the MathWorks documentation for "encode" and "decode" to learn more about these functions.
Kindly note that using the "bert" model in MATLAB requires the Text Analytics Toolbox.
Hope this helps
