neuralmind
2 models
bert-large-portuguese-cased
---
language: pt
license: mit
tags:
- bert
- pytorch
datasets:
- brWaC
---
license: mit • 817,822 downloads • 69 likes
bert-base-portuguese-cased
BERTimbau Base is a pretrained BERT model for Brazilian Portuguese that achieves state-of-the-art performance on three downstream NLP tasks: Named Entity Recognition, Sentence Textual Similarity, and Recognizing Textual Entailment. It is available in two sizes: Base and Large. For further information or requests, please go to the BERTimbau repository.

| Model                                    | Arch.      | #Layers | #Params |
| ---------------------------------------- | ---------- | ------- | ------- |
| `neuralmind/bert-base-portuguese-cased`  | BERT-Base  | 12      | 110M    |
| `neuralmind/bert-large-portuguese-cased` | BERT-Large | 24      | 335M    |
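The #Params column can be roughly reproduced from the layer and hidden-size figures alone. The sketch below assumes the standard BERT layout (hidden sizes 768/1024, 4x feed-forward width, 512 positions, 2 segment types) and a vocabulary of about 29,794 tokens; the vocabulary size is an assumption about BERTimbau, not stated in the table.

```python
# Rough parameter-count estimate for the two BERTimbau sizes, using the
# standard BERT encoder layout. VOCAB is an assumed value for BERTimbau's
# Portuguese vocabulary, not taken from the table above.

def bert_params(n_layers: int, hidden: int, vocab: int = 29_794,
                max_pos: int = 512, type_vocab: int = 2) -> int:
    """Approximate parameter count of a BERT encoder (with pooler)."""
    ffn = 4 * hidden  # intermediate size is 4x hidden in standard BERT
    # Token, position, and segment embeddings plus their LayerNorm.
    embeddings = (vocab + max_pos + type_vocab) * hidden + 2 * hidden
    # Per layer: Q/K/V/output projections, feed-forward, two LayerNorms.
    attention = 4 * (hidden * hidden + hidden)
    feed_forward = (hidden * ffn + ffn) + (ffn * hidden + hidden)
    layer = attention + feed_forward + 2 * (2 * hidden)
    pooler = hidden * hidden + hidden
    return embeddings + n_layers * layer + pooler

base = bert_params(12, 768)    # close to the 110M in the BERT-Base row
large = bert_params(24, 1024)  # close to the 335M in the BERT-Large row
print(f"base: {base/1e6:.0f}M, large: {large/1e6:.0f}M")
```

The estimate lands within a few percent of the table's figures; the small gap comes from rounding and from the assumed vocabulary size.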
license: mit • 108,449 downloads • 210 likes