Text Analytics Toolbox Model for BERT-Small Network

Pretrained BERT-Small Network for MATLAB.
88 downloads
Updated 25 Nov 2025
BERT-Small is a pretrained language model based on the transformer deep learning architecture that can be used for a wide variety of natural language processing (NLP) tasks. This model has 4 self-attention layers and a hidden size of 512.
To load a BERT-Small model, you can run the following code:
[net, tokenizer] = bert(Model="small");
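Once loaded, the model can be applied to text using the tokenizer's encode function and the network's predict function. The sketch below is a minimal, hedged example of that workflow; the exact number, names, and order of the network inputs (token codes, attention mask, segment IDs) can be confirmed by inspecting net.InputNames, so check those properties before relying on the input order shown here.

```matlab
% Minimal usage sketch (assumes the Text Analytics Toolbox Model for
% BERT-Small Network support package is installed).
[net, tokenizer] = bert(Model="small");

str = "Text analytics with BERT.";

% Encode the text into token codes and segment IDs.
[tokenCodes, segments] = encode(tokenizer, str);

% Convert to dlarray inputs; the mask marks all tokens as valid.
% NOTE: input order below is an assumption - verify with net.InputNames.
X    = dlarray(tokenCodes{1}, "CT");
mask = dlarray(ones(size(tokenCodes{1})), "CT");
seg  = dlarray(segments{1}, "CT");

% Run the network to obtain contextual token embeddings
% (hidden size 512 for BERT-Small).
Y = predict(net, X, mask, seg);
```

The output Y contains one 512-dimensional embedding per input token, which can then feed downstream NLP tasks such as classification or similarity scoring.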
MATLAB Release Compatibility
Created with R2023b
Compatible with any release from R2023b to R2026a
Platform Compatibility
Windows macOS (Apple Silicon) macOS (Intel) Linux