Text Analytics Toolbox Model for BERT-Base Multilingual Cased Network

Pretrained BERT-Base Multilingual Cased Network for MATLAB

Ahora está siguiendo esta publicación

BERT-Base Multilingual Cased is a pretrained language model based on Deep Learning Transformer architecture that can be used for a wide variety of Natural Language Processing (NLP) tasks. This model has 12 self-attention layers and a hidden size of 768.
To load a BERT-Multilingual Cased model, you can run the following code:
[net, tokenizer] = bert(Model="multilingual");

Compatibilidad con la versión de MATLAB

  • Compatible con cualquier versión desde R2023b hasta R2026a

Compatibilidad con las plataformas

  • Windows
  • macOS (Apple Silicon)
  • macOS (Intel)
  • Linux