lang-uk
20 models
| Model | License | Downloads | Likes |
|---|---|---|---|
| ukr-paraphrase-multilingual-mpnet-base | apache-2.0 | 3,261 | 12 |
| uk_ner_wechsel_minixhofer_roberta_large | mit | 214 | 0 |
| OmniGEC-Minimal-8B | mit | 88 | 3 |
| electra-base-ukrainian-cased-discriminator | mit | 26 | 3 |
| flair-uk-ner | mit | 20 | 3 |
| dragoman | apache-2.0 | 16 | 12 |
| flair-uk-pos | mit | 7 | 1 |
| electra-base-ukrainian-cased-generator | mit | 6 | 1 |
| OmniGEC-Minimal-12B | mit | 2 | 1 |
| electra-base-ukrainian-v2-cased-discriminator | mit | 0 | 6 |
| electra-base-ukrainian-v2-dbmdz-vocab-cased-discriminator | mit | 0 | 5 |
| flair-uk-forward | mit | 0 | 3 |
| fasttext_uk | mit | 0 | 2 |
| OmniGEC-Fluency-8B | mit | 0 | 2 |
| OmniGEC-Fluency-12B | mit | 0 | 2 |
| Ukr Clip Vit H 14 Frozen Xlm Roberta Large Laion5B S13B B90k | mit | 0 | 2 |
| flair-uk-backward | mit | 0 | 1 |
| fasttext_uk_cbow | mit | 0 | 1 |
| dragoman-4bit | apache-2.0 | 0 | 1 |
roberta-large-ner-uk (license: apache-2.0 · 0 downloads · 1 like)

A transformer-based NER model for Ukrainian, trained on a combination of human-annotated data (NER-UK 2.0) and high-quality silver-standard annotations (UberText-NER-Silver). Based on `roberta-large-NER`, this model achieves state-of-the-art performance on a wide range of named entities in Ukrainian.

- Model type: transformer-based encoder (spaCy pipeline)
- Language (NLP): Ukrainian
- License: Apache 2.0
- Finetuned from model: `51la5/roberta-large-NER`
- Entity types (13): `PERS`, `ORG`, `LOC`, `DATE`, `TIME`, `JOB`, `MON`, `PCT`, `PERIOD`, `DOC`, `QUANT`, `ART`, `MISC`
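Since roberta-large-ner-uk ships as a spaCy pipeline, extracting entities follows the usual spaCy pattern: load the pipeline, run it over text, and read `doc.ents`. The sketch below is a minimal illustration, not the model's documented API; the load path `"uk_ner_pipeline"` and the helper name are assumptions, and the entity-type set is the 13 labels listed on the model card.

```python
# Hedged usage sketch for a spaCy-packaged Ukrainian NER model.
# The package name passed to spacy.load() below is an assumption;
# install whatever pipeline wheel the model actually publishes first.
#
# import spacy
# nlp = spacy.load("uk_ner_pipeline")  # assumed package name

# The 13 entity types documented on the model card.
ENTITY_TYPES = frozenset([
    "PERS", "ORG", "LOC", "DATE", "TIME", "JOB", "MON",
    "PCT", "PERIOD", "DOC", "QUANT", "ART", "MISC",
])

def extract_entities(nlp, text):
    """Run the pipeline on `text` and return (span text, label) pairs,
    keeping only the labels the model card documents."""
    doc = nlp(text)
    return [
        (ent.text, ent.label_)
        for ent in doc.ents
        if ent.label_ in ENTITY_TYPES
    ]
```

`extract_entities` only assumes the standard spaCy `Doc.ents` / `Span.label_` interface, so the same helper works with any spaCy NER pipeline that emits these labels.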