GroNLP

26 models

bert-base-dutch-cased • 3,878 downloads • 33 likes

gpt2-small-dutch • 3,161 downloads • 6 likes

hateBERT

Tommaso Caselli • Valerio Basile • Jelena Mitrović • Michael Granitzer

HateBERT is an English pre-trained BERT model obtained by further training the English BERT base uncased model on more than 1 million posts from banned Reddit communities. It was developed as a collaboration between the University of Groningen, the University of Turin, and the University of Passau. For details, see the paper presented at WOAH 2021. The code and the fine-tuned models are available on OSF. The fine-tuned models are released under a different license; check the dedicated repository.

license:apache-2.0 • 2,960 downloads • 38 likes
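Models in this listing can be pulled from the Hugging Face Hub by their organization-qualified ID. A minimal sketch, assuming the `transformers` library is installed and `GroNLP/hateBERT` as the hub identifier (hateBERT keeps BERT's masked-language-modeling head, so `AutoModelForMaskedLM` is used here):

```python
# Minimal sketch: loading a GroNLP model from the Hugging Face Hub by its
# organization-qualified ID. Assumes `transformers` is installed; the model
# ID "GroNLP/hateBERT" is inferred from this listing.

def hub_id(org: str, model: str) -> str:
    """Build the "organization/model" identifier the Hub expects."""
    return f"{org}/{model}"

def load_hatebert():
    # Import deferred so the sketch can be read without transformers installed.
    from transformers import AutoModelForMaskedLM, AutoTokenizer

    model_id = hub_id("GroNLP", "hateBERT")
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForMaskedLM.from_pretrained(model_id)  # BERT with MLM head
    return tokenizer, model
```

The same `hub_id` pattern applies to any entry above, e.g. `hub_id("GroNLP", "bert-base-dutch-cased")`; the first call downloads and caches the weights locally.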

wav2vec2-dutch-large-ft-cgn • 683 downloads • 1 like

gpt2-small-italian • 606 downloads • 13 likes

mdebertav3-subjectivity-english • 393 downloads • 1 like

gpt2-medium-italian-embeddings • 89 downloads • 3 likes

gpt2-small-italian-embeddings • 68 downloads • 1 like

mdebertav3-subjectivity-multilingual • 33 downloads • 2 likes

gpt2-small-dutch-embeddings • 30 downloads • 2 likes

mdebertav3-subjectivity-italian • 23 downloads • 0 likes

wav2vec2-large-xlsr-53-ft-cgn • 20 downloads • 3 likes

bert_dutch_base_offensive_language • license:apache-2.0 • 19 downloads • 1 like

bert-base-dutch-cased-upos-alpino • 14 downloads • 1 like

bert-base-dutch-cased-frisian • 11 downloads • 1 like

bert-base-dutch-cased-gronings • 11 downloads • 1 like

gpt2-medium-dutch-embeddings • 7 downloads • 3 likes

bert-base-dutch-cased-upos-alpino-frisian • 7 downloads • 0 likes

bert-base-dutch-cased-upos-alpino-gronings • 7 downloads • 0 likes

mdebertav3-subjectivity-dutch • 5 downloads • 0 likes

mdebertav3-subjectivity-german • 3 downloads • 1 like

bert_dutch_base_abusive_language • license:apache-2.0 • 2 downloads • 0 likes

mdebertav3-subjectivity-turkish • 2 downloads • 0 likes

mdebertav3-subjectivity-arabic • 2 downloads • 0 likes

T0pp-sharded • license:apache-2.0 • 1 download • 5 likes

wav2vec2-dutch-large • 0 downloads • 2 likes