emre

35 models

wav2vec2-xls-r-300m-Russian-small

license:apache-2.0
215
2

Whisper Medium Turkish 2

This model is a fine-tuned version of openai/whisper-medium on the Common Voice 11.0 dataset. It achieves the following results on the evaluation set:
- Loss: 0.211673
- WER: 18.51

It adapts the OpenAI Whisper Medium transformer for Turkish audio-to-text transcription. Weight decay is set to 0.1 to cope with overfitting; it also showed slightly better results on the evaluation set. The training data is the initial 10% of the train and validation splits of Turkish Common Voice 11.0 from the Mozilla Foundation. After loading the pre-trained model, it was trained on this dataset. The model is available through its Hugging Face web app.

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 4000
- mixed_precision_training: Native AMP
- weight_decay: 0.1

Framework versions:
- Transformers 4.26.0.dev0
- PyTorch 1.12.1+cu113
- Datasets 2.7.1
- Tokenizers 0.13.2
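The reported WER can be illustrated with a minimal sketch of the metric itself: word-level edit distance divided by reference length. The `wer` helper and the example sentences below are hypothetical illustrations, not part of the model card.

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance over reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # Classic dynamic-programming edit distance, computed over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# One substituted word out of four reference words -> WER = 0.25
print(wer("merhaba bugün hava güzel", "merhaba bugün hava kötü"))  # 0.25
```

On this scale, the card's WER of 18.51 (a percentage) means roughly one word in five is transcribed incorrectly.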
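The linear scheduler with 500 warmup steps over 4000 training steps can be sketched in plain Python. This is a hypothetical re-implementation for illustration, not the Transformers library code the card actually used.

```python
def linear_schedule(step: int, warmup_steps: int = 500,
                    total_steps: int = 4000, base_lr: float = 1e-05) -> float:
    """Linear warmup to base_lr, then linear decay to zero."""
    if step < warmup_steps:
        # Ramp up proportionally during warmup.
        return base_lr * step / warmup_steps
    # Decay linearly from base_lr at warmup end to zero at total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_schedule(250))   # halfway through warmup: 5e-06
print(linear_schedule(500))   # peak learning rate: 1e-05
print(linear_schedule(4000))  # end of training: 0.0
```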

license:apache-2.0
110
19

turkish-sentiment-analysis

license:apache-2.0
27
6

spanish-dialoGPT

license:mit
22
2

mybankconcept

11
1

llama-2-13b-code-122k

llama
10
4

switch-base-8-finetuned-samsum

license:apache-2.0
7
0

wav2vec2-large-xls-r-300m-tr

license:apache-2.0
6
0

distilbert-base-uncased-finetuned-squad

license:apache-2.0
5
0

wav2vec2-xls-r-300m-ab-CV8

license:apache-2.0
5
0

wav2vec2-large-xlsr-53-W2V2-TATAR-SMALL

license:apache-2.0
4
1

distilgpt2-pretrained-tr-10e

license:apache-2.0
4
0

xglm-564M-turkish

license:apache-2.0
4
0

gemma-3-12b-it-tr-reasoning40k

license:apache-2.0
4
0

detr-resnet-50_finetuned_cppe5

license:apache-2.0
3
0

gemma-3-27b-it-tr-reasoning40k-4bit

license:apache-2.0
3
0

gemma-2-9b-Turkish-Lora-Continue-Pre-Trained

2
5

opus-mt-tr-en-finetuned-en-to-tr

license:apache-2.0
2
1

speecht5_tts_tr

2
1

gemma-3-1b-it-tr-reasoning40k

- Developed by: emre
- License: apache-2.0
- Finetuned from model: unsloth/gemma-3-1b-it

This gemma3_text model was trained 2x faster with Unsloth and Hugging Face's TRL library.

license:apache-2.0
2
1

Qwen-0.5B-GRPO

2
0

gemma-3-4b-it-tr-reasoning40k

license:apache-2.0
2
0

gemma-2-2b-it-tr-reasoning40k

license:apache-2.0
2
0

llama-2-13b-mini

llama
1
3

gemma-3-12b-Cont-IT-TR

license:apache-2.0
1
2

wav2vec2-xls-r-300m-Br-small

license:apache-2.0
1
0

wav2vec2-xls-r-300m-Tr-med-CommonVoice8

license:apache-2.0
1
0

wav2vec2-xls-r-300m-Turkish-Tr-small-CommonVoice8

license:apache-2.0
1
0

wav2vec2-xls-r-300m-Turkish-Tr-small

license:apache-2.0
1
0

gemma-3-tr-finetuned-it

license:apache-2.0
1
0

llama-2-13b-code-chat

llama
0
4

java_8m_methods_doc2vec

0
4

gemma-3-27b-it-tr-reasoning40k-4bit

0
4

java-RoBERTa-Tara-small

license:apache-2.0
0
2

DeepSeek-R1-Qwen-14B-tr-ORPO

0
1