Text Analytics Toolbox Model for BERT-Tiny Network

Pretrained BERT-Tiny Network for MATLAB.
Updated 20 Mar 2024
BERT-Tiny is a pretrained language model based on the Transformer deep learning architecture that can be used for a wide variety of natural language processing (NLP) tasks. This model has 2 self-attention layers and a hidden size of 128.
To load a BERT-Tiny model, you can run the following code:
[net, tokenizer] = bert(Model="tiny");
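Once loaded, the returned tokenizer can convert raw text into the token codes the network expects. The sketch below illustrates this workflow; the exact input names and `predict` call for the returned `dlnetwork` may vary by release, and the sample sentence is purely illustrative:

```matlab
% Load the pretrained BERT-Tiny model and its tokenizer
[net, tokenizer] = bert(Model="tiny");

% Encode a sentence into token codes and segment indices
str = "Text Analytics Toolbox supports BERT models.";
[tokenCodes, segments] = encode(tokenizer, str);

% Inspect the tokenization; decode maps token codes back to text
decodedText = decode(tokenizer, tokenCodes);
```

The token codes and segment indices can then be wrapped in `dlarray` objects and passed to the network's `predict` function to obtain contextual embeddings for downstream NLP tasks.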
MATLAB Release Compatibility
Created with R2023b
Compatible with R2023b to R2024a
Platform Compatibility
Windows macOS (Apple Silicon) macOS (Intel) Linux
