akdeniz27
bert-base-turkish-cased-ner
This model is a fine-tuned version of "dbmdz/bert-base-turkish-cased" on a reviewed version of the well-known Turkish NER dataset (https://github.com/stefan-it/turkish-bert/files/4558187/nerdata.txt). Please refer to "https://huggingface.co/transformers/modules/transformers/pipelines/tokenclassification.html" for entity grouping with the aggregation_strategy parameter.

Reference test results:
accuracy:  0.9933935699477056
f1:        0.9592969472710453
precision: 0.9543530277931161
recall:    0.9642923563325274

Evaluation results with the test sets proposed in: Küçük, D., Küçük, D., Arıcı, N. 2016. "A Named Entity Recognition Dataset for Turkish" ("Türkçe Varlık İsmi Tanıma için bir Veri Kümesi"). IEEE Signal Processing and Communication Applications Conference (SIU). Zonguldak, Turkey.

Test Set   Acc.     Prec.    Rec.     F1-Score
20010000   0.9946   0.9871   0.9463   0.9662
20020000   0.9928   0.9134   0.9206   0.9170
20030000   0.9942   0.9814   0.9186   0.9489
20040000   0.9943   0.9660   0.9522   0.9590
20050000   0.9971   0.9539   0.9932   0.9732
20060000   0.9993   0.9942   0.9942   0.9942
20070000   0.9970   0.9806   0.9439   0.9619
20080000   0.9988   0.9821   0.9649   0.9735
20090000   0.9977   0.9891   0.9479   0.9681
20100000   0.9961   0.9684   0.9293   0.9485
Overall    0.9961   0.9720   0.9516   0.9617
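The aggregation_strategy parameter merges the pipeline's per-token B-/I- predictions into whole entity spans. As a minimal pure-Python sketch of that grouping logic (the tokens and tags below are illustrative examples, not actual model output):

```python
# Minimal sketch of BIO entity grouping, similar in spirit to the
# "simple" aggregation_strategy of the transformers NER pipeline.
# The tokens and tags below are illustrative, not real model output.

def group_entities(tokens, tags):
    """Merge consecutive B-/I- tagged tokens into (entity_type, text) spans."""
    entities = []
    current_type, current_tokens = None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current_type:
                entities.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_type == tag[2:]:
            current_tokens.append(token)
        else:  # "O" tag or an inconsistent I- tag closes the current entity
            if current_type:
                entities.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = None, []
    if current_type:
        entities.append((current_type, " ".join(current_tokens)))
    return entities

tokens = ["Mustafa", "Kemal", "Atatürk", "1919", "yılında", "Samsun", "'a", "çıktı", "."]
tags   = ["B-PER", "I-PER", "I-PER", "O", "O", "B-LOC", "O", "O", "O"]
print(group_entities(tokens, tags))
# → [('PER', 'Mustafa Kemal Atatürk'), ('LOC', 'Samsun')]
```

In practice the grouping is done for you by the pipeline, e.g. `pipeline("ner", model="akdeniz27/bert-base-turkish-cased-ner", aggregation_strategy="simple")`, which also merges subword pieces and averages their scores.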
mbert-base-albanian-cased-ner
roberta-base-cuad
bert-turkish-text-classification
xlm-roberta-base-turkish-ner
Turkish Named Entity Recognition (NER) Model

This model is a fine-tuned version of "xlm-roberta-base" (a multilingual RoBERTa model) on a reviewed version of the well-known Turkish NER dataset (https://github.com/stefan-it/turkish-bert/files/4558187/nerdata.txt).

Fine-tuning parameters:

Please refer to "https://huggingface.co/transformers/modules/transformers/pipelines/tokenclassification.html" for entity grouping with the aggregation_strategy parameter.

Reference test results:
accuracy:  0.9919343118732742
f1:        0.9492100796448622
precision: 0.9407349896480332
recall:    0.9578392621870883
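The reported F1 is the harmonic mean of the reported precision and recall, which can be sanity-checked in a few lines:

```python
# Sanity-check that the reported F1 equals the harmonic mean of
# precision and recall (values taken from the model card above).
precision = 0.9407349896480332
recall = 0.9578392621870883

f1 = 2 * precision * recall / (precision + recall)
print(f"{f1:.4f}")  # → 0.9492, matching the reported F1 to four decimals
```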
llama-2-7b-hf-qlora-dolly15k-turkish
convbert-base-turkish-cased-ner
mDeBERTa-v3-base-turkish-ner
LunarLander-v1
dqn-SpaceInvadersNoFrameskip-v4
bert-base-turkish-cased-ner-lora
ppo-SnowballTarget
deberta-v2-xlarge-cuad
ppo-Pyramids
modernbert-base-tr-uncased-ner
This model is a fine-tuned version of "artiwise-ai/modernbert-base-tr-uncased" on a reviewed version of the well-known Turkish NER dataset (https://github.com/stefan-it/turkish-bert/files/4558187/nerdata.txt). Please refer to "https://huggingface.co/transformers/modules/transformers/pipelines/tokenclassification.html" for entity grouping with the aggregation_strategy parameter.

Reference test results:
accuracy:  0.9910922551637875
f1:        0.9323197128075177
precision: 0.9292780467270049
recall:    0.9353813559322034
ppo-Huggy
bert-base-hungarian-cased-ner
roberta-large-cuad
bert-base-turkish-cased-ner-quantized
a2c-AntBulletEnv-v0
a2c-PandaReachDense-v2
poca-SoccerTwos
mmbert-base-tr-uncased-ner
This model is a fine-tuned version of the multilingual ModernBERT model "jhu-clsp/mmBERT-base" on a reviewed version of the well-known Turkish NER dataset (https://github.com/stefan-it/turkish-bert/files/4558187/nerdata.txt). Please refer to "https://huggingface.co/transformers/modules/transformers/pipelines/tokenclassification.html" for entity grouping with the aggregation_strategy parameter.

Reference test results:
accuracy:  0.991023766617932
f1:        0.9414858645627877
precision: 0.9397695785328861
recall:    0.9432084309133489