akdeniz27

23 models

bert-base-turkish-cased-ner

This model is a fine-tuned version of "dbmdz/bert-base-turkish-cased" on a reviewed version of the well-known Turkish NER dataset (https://github.com/stefan-it/turkish-bert/files/4558187/nerdata.txt). Please refer to "https://huggingface.co/transformers/modules/transformers/pipelines/tokenclassification.html" for entity grouping with the aggregation_strategy parameter.

Reference test results:
accuracy: 0.9933935699477056
f1: 0.9592969472710453
precision: 0.9543530277931161
recall: 0.9642923563325274

Evaluation results with the test sets proposed in Küçük, D., Küçük, D., Arıcı, N. 2016. "Türkçe Varlık İsmi Tanıma için bir Veri Kümesi" ("A Named Entity Recognition Dataset for Turkish"), IEEE Sinyal İşleme, İletişim ve Uygulamaları Kurultayı, Zonguldak, Türkiye:

Test Set   Acc.    Prec.   Rec.    F1-Score
20010000   0.9946  0.9871  0.9463  0.9662
20020000   0.9928  0.9134  0.9206  0.9170
20030000   0.9942  0.9814  0.9186  0.9489
20040000   0.9943  0.9660  0.9522  0.9590
20050000   0.9971  0.9539  0.9932  0.9732
20060000   0.9993  0.9942  0.9942  0.9942
20070000   0.9970  0.9806  0.9439  0.9619
20080000   0.9988  0.9821  0.9649  0.9735
20090000   0.9977  0.9891  0.9479  0.9681
20100000   0.9961  0.9684  0.9293  0.9485
Overall    0.9961  0.9720  0.9516  0.9617
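The model card points to the Transformers token-classification pipeline for entity grouping via the aggregation_strategy parameter. A minimal usage sketch, assuming the akdeniz27/bert-base-turkish-cased-ner checkpoint from this listing; the example sentence is chosen here for illustration and is not from the model card:

```python
from transformers import pipeline

# Token-classification pipeline with entity grouping enabled through
# aggregation_strategy, as the model card recommends. "simple" merges
# sub-word tokens back into whole-word entity spans.
ner = pipeline(
    "ner",
    model="akdeniz27/bert-base-turkish-cased-ner",
    aggregation_strategy="simple",
)

# Illustrative Turkish input (not taken from the model card).
text = "Mustafa Kemal Atatürk 19 Mayıs 1919'da Samsun'a çıktı."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```

With grouping enabled, each result carries an entity_group label (e.g. PER, LOC) instead of per-token B-/I- tags.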

license:mit
89,871
26

mbert-base-albanian-cased-ner

license:mit
1,230
2

roberta-base-cuad

44
0

bert-turkish-text-classification

20
0

xlm-roberta-base-turkish-ner

Turkish Named Entity Recognition (NER) Model

This model is a fine-tuned version of "xlm-roberta-base" (a multilingual version of RoBERTa) on a reviewed version of the well-known Turkish NER dataset (https://github.com/stefan-it/turkish-bert/files/4558187/nerdata.txt). Please refer to "https://huggingface.co/transformers/modules/transformers/pipelines/tokenclassification.html" for entity grouping with the aggregation_strategy parameter.

Fine-tuning parameters:

Reference test results:
accuracy: 0.9919343118732742
f1: 0.9492100796448622
precision: 0.9407349896480332
recall: 0.9578392621870883
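The reported F1 values are consistent with the standard harmonic mean of precision and recall; a quick check against the figures above (the formula is the standard definition, not taken from the model card):

```python
def f1_score(precision: float, recall: float) -> float:
    """F1 as the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Precision and recall reported for this model above.
precision = 0.9407349896480332
recall = 0.9578392621870883

f1 = f1_score(precision, recall)
print(f1)  # ≈ 0.94921, matching the reported f1 of 0.9492100796448622
```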

license:mit
19
7

llama-2-7b-hf-qlora-dolly15k-turkish

base_model:meta-llama/Llama-2-7b-hf
17
8

convbert-base-turkish-cased-ner

15
3

mDeBERTa-v3-base-turkish-ner

license:mit
14
1

LunarLander-v1

14
0

dqn-SpaceInvadersNoFrameskip-v4

12
0

bert-base-turkish-cased-ner-lora

license:mit
8
2

ppo-SnowballTarget

7
0

deberta-v2-xlarge-cuad

5
2

ppo-Pyramids

5
0

modernbert-base-tr-uncased-ner

This model is a fine-tuned version of "artiwise-ai/modernbert-base-tr-uncased" on a reviewed version of the well-known Turkish NER dataset (https://github.com/stefan-it/turkish-bert/files/4558187/nerdata.txt). Please refer to "https://huggingface.co/transformers/modules/transformers/pipelines/tokenclassification.html" for entity grouping with the aggregation_strategy parameter.

Reference test results:
accuracy: 0.9910922551637875
f1: 0.9323197128075177
precision: 0.9292780467270049
recall: 0.9353813559322034

license:mit
4
3

ppo-Huggy

4
0

bert-base-hungarian-cased-ner

license:mit
2
3

roberta-large-cuad

2
3

bert-base-turkish-cased-ner-quantized

license:mit
1
1

a2c-AntBulletEnv-v0

1
0

a2c-PandaReachDense-v2

1
0

poca-SoccerTwos

1
0

mmbert-base-tr-uncased-ner

This model is a fine-tuned version of the multilingual ModernBERT model "jhu-clsp/mmBERT-base" on a reviewed version of the well-known Turkish NER dataset (https://github.com/stefan-it/turkish-bert/files/4558187/nerdata.txt). Please refer to "https://huggingface.co/transformers/modules/transformers/pipelines/tokenclassification.html" for entity grouping with the aggregation_strategy parameter.

Reference test results:
accuracy: 0.991023766617932
f1: 0.9414858645627877
precision: 0.9397695785328861
recall: 0.9432084309133489

license:mit
1
0