# bhadresh-savani/bert-base-uncased-emotion
BERT is a bidirectional encoder Transformer architecture trained with a masked language modeling (MLM) objective. This model is `bert-base-uncased` fine-tuned on the emotion dataset using the Hugging Face Trainer with the training parameters below.

## Model performance comparison on the Twitter emotion dataset

| Model | Accuracy | F1 Score | Test Samples per Second |
| --- | --- | --- | --- |
| Distilbert-base-uncased-emotion | 93.8 | 93.79 | 398.69 |
| Bert-base-uncased-emotion | 94.05 | 94.06 | 190.152 |
| Roberta-base-emotion | 93.95 | 93.97 | 195.639 |
| Albert-base-v2-emotion | 93.6 | 93.65 | 182.794 |

## Training procedure

Colab Notebook: follow the notebook above, changing the model name from distilbert to bert.

## Reference

- *Natural Language Processing with Transformers* by Lewis Tunstall, Leandro von Werra, and Thomas Wolf
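As a minimal usage sketch (assuming the model is published on the Hugging Face Hub under the id `bhadresh-savani/bert-base-uncased-emotion`), the fine-tuned classifier can be loaded with the `transformers` text-classification pipeline:

```python
# Minimal inference sketch; assumes the Hub model id
# "bhadresh-savani/bert-base-uncased-emotion" and an installed
# `transformers` package with a PyTorch or TensorFlow backend.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="bhadresh-savani/bert-base-uncased-emotion",
    top_k=None,  # return a score for every emotion label, not just the top one
)

# Each result is a list of {"label": ..., "score": ...} dicts,
# one per emotion class (sadness, joy, love, anger, fear, surprise).
results = classifier("I love using transformers for NLP tasks!")
print(results)
```

Setting `top_k=None` returns the full score distribution over all six emotion labels; omit it to get only the highest-scoring label.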