joeddav

4 models

xlm-roberta-large-xnli

---
language:
- multilingual
- en
- fr
- es
- de
- el
- bg
- ru
- tr
- ar
- vi
- th
- zh
- hi
- sw
- ur
tags:
- text-classification
- pytorch
- tensorflow
datasets:
- multi_nli
- xnli
license: mit
pipeline_tag: zero-shot-classification
widget:
- text: "За кого вы голосуете в 2020 году?"
  candidate_labels: "politique étrangère, Europe, élections, affaires, politique"
  multi_class: true
- text: "لمن تصوت في 2020؟"
  candidate_labels: "السياسة الخارجية, أوروبا, الانتخابات, الأعمال, السياسة"
  multi_class: true
---

license: mit · 190,114 downloads · 265 likes
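The `zero-shot-classification` pipeline tag above means the model scores each candidate label via natural language inference: every label is wrapped in a hypothesis sentence and scored for entailment against the input, then the entailment scores are normalized across labels. The sketch below shows that mechanism with a stand-in `entailment_logit` function (word overlap, an assumption for illustration); in real use the logits come from the NLI model itself, e.g. xlm-roberta-large-xnli.

```python
import math

def _tokens(s):
    """Lowercase word set, punctuation stripped (toy tokenizer)."""
    return {w.strip(".,!?") for w in s.lower().split()}

def entailment_logit(premise, hypothesis):
    # Stand-in for the NLI model's entailment score: word overlap.
    return float(len(_tokens(premise) & _tokens(hypothesis)))

def zero_shot(premise, candidate_labels, template="This example is {}."):
    """Score each label's hypothesis, then softmax across labels."""
    logits = [entailment_logit(premise, template.format(lab))
              for lab in candidate_labels]
    exps = [math.exp(l) for l in logits]
    total = sum(exps)
    return {lab: e / total for lab, e in zip(candidate_labels, exps)}

scores = zero_shot("new elections announced in Europe",
                   ["elections", "business", "sports"])
```

With the real model, the same template-plus-softmax step is what the pipeline performs internally; `multi_class: true` in the widget config instead sigmoids each label independently rather than softmaxing across them.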

bart-large-mnli-yahoo-answers

---
language: en
license: apache-2.0
tags:
- text-classification
- pytorch
datasets:
- yahoo-answers
pipeline_tag: zero-shot-classification
base_model:
- facebook/bart-large-mnli
---

license: apache-2.0 · 177,629 downloads · 12 likes

distilbert-base-uncased-go-emotions-student

This model is distilled from the zero-shot classification pipeline on the unlabeled GoEmotions dataset using this script. It was trained with mixed precision for 10 epochs and otherwise used the default script arguments. The model can be used like any other model trained on GoEmotions, but will likely not perform as well as a model trained with full supervision. It is primarily intended as a demo of how an expensive NLI-based zero-shot model can be distilled into a more efficient student, allowing a classifier to be trained with only unlabeled data. Note that although the GoEmotions dataset allows multiple labels per instance, the teacher used single-label classification to create pseudo-labels.
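The distillation recipe described above boils down to: run the zero-shot teacher over unlabeled text, take the argmax label as a single-label pseudo-label, and train the student on the resulting (text, label) pairs. A minimal sketch, with a keyword-matching stub standing in for the expensive NLI teacher so it runs without model downloads:

```python
def teacher_pseudo_label(text, labels):
    """Stub teacher: pick the label whose name occurs most often in the
    text, mimicking argmax over a zero-shot pipeline's label scores."""
    scores = [text.lower().count(lab) for lab in labels]
    return labels[scores.index(max(scores))]

labels = ["joy", "anger", "sadness"]
unlabeled = [
    "what a joy to see you again",
    "this fills me with anger",
    "a day of sadness and loss",
]

# Build the distillation training set from single-label pseudo-labels;
# a supervised student (e.g. a DistilBERT classifier) is then trained
# on these pairs instead of on human annotations.
train_set = [(t, teacher_pseudo_label(t, labels)) for t in unlabeled]
```

Because the teacher emits exactly one label per example, the student inherits that single-label framing even on a dataset like GoEmotions that permits multiple labels per instance.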

license: mit · 9,594 downloads · 83 likes

distilbert-base-uncased-agnews-student

license: mit · 17 downloads · 5 likes