# AraBertMo_base_V9
## Arabic BERT Model
AraBERTMo is an Arabic pre-trained language model based on Google's BERT architecture. AraBERTMo_base uses the same BERT-Base configuration and now comes in 10 new variants. All models are available on the `HuggingFace` model page under the Ebtihal name. Checkpoints are available in PyTorch format.

## Pretraining Corpus
The `AraBertMo_base_V9` model was pre-trained on ~3 million words from the OSCAR corpus, Arabic version ("unshuffled_deduplicated_ar").

## Training Results
This model achieves the following results:

| Task | Num examples | Num epochs | Batch size | Steps | Wall time | Training loss |
|:----:|:------------:|:----------:|:----------:|:-----:|:---------:|:-------------:|
| Fill-Mask | 30024 | 9 | 64 | 4230 | 7h 57m 42s | 7.3264 |

## This model was built for master's degree research at:
- University of Kufa
- Faculty of Computer Science and Mathematics
- Department of Computer Science

## Load Pretrained Model
You can use this model by installing `torch` or `tensorflow` and the Hugging Face `transformers` library, and then initializing it directly.
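A minimal sketch of that loading step with `transformers` is shown below. The repository id `Ebtihal/AraBertMo_base_V9` is an assumption inferred from the owner and model names in this card; verify it on the Hub before use.

```python
# Sketch of loading the model for fill-mask inference.
# NOTE: the repo id below is assumed from the names in this card.
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_name = "Ebtihal/AraBertMo_base_V9"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Fill-Mask is the task this model was trained on (see the results table).
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
predictions = fill_mask("السلام عليكم ورحمة [MASK]")
print(predictions[0]["token_str"])
```

Using `AutoModelForMaskedLM` (rather than a plain `AutoModel`) matches the fill-mask objective the checkpoint was trained on.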