gigant

20 models

romanian-wav2vec2

---
language:
- ro
license: apache-2.0
tags:
- automatic-speech-recognition
- hf-asr-leaderboard
- robust-speech-event
datasets:
- mozilla-foundation/common_voice_8_0
- gigant/romanian_speech_synthesis_0_8_1
base_model: facebook/wav2vec2-xls-r-300m
model-index:
- name: wav2vec2-ro-300m_01
  results:
  - task:
      type: automatic-speech-recognition
      name: Automatic Speech Recognition
    dataset:
      name: Robust Speech Event
      type: speech-recognition-community-v2/dev_data
      args: ro
    metrics:
    - type: wer
      value: 46.9
---

license: apache-2.0 • 357,439 downloads • 6 likes
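The model card above reports a word error rate (WER) of 46.9 on the Robust Speech Event dev data. WER is the word-level edit distance between hypothesis and reference, divided by the number of reference words; a minimal pure-Python sketch (not the actual leaderboard scoring code, which typically uses libraries such as `jiwer`):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Rolling-row dynamic program over words, O(len(ref) * len(hyp)).
    d = list(range(len(hyp) + 1))
    for i in range(1, len(ref) + 1):
        prev, d[0] = d[0], i
        for j in range(1, len(hyp) + 1):
            cur = d[j]
            d[j] = min(
                d[j] + 1,                            # deletion
                d[j - 1] + 1,                        # insertion
                prev + (ref[i - 1] != hyp[j - 1]),   # substitution (or match)
            )
            prev = cur
    return d[len(hyp)] / len(ref)

# A WER of 46.9 means roughly 47 word errors per 100 reference words.
print(wer("a b c", "a x c"))  # one substitution out of three words
```
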

whisper-medium-romanian

This model is a fine-tuned version of openai/whisper-medium on the Common Voice 11.0 dataset and the Romanian speech synthesis corpus. It achieves the following results on the evaluation set:
- eval_loss: 0.06453
- eval_wer: 4.717
- epoch: 7.03
- step: 3500

The architecture is the same as openai/whisper-medium. The model was trained on the Common Voice 11.0 dataset (`train+validation+other` splits) and the Romanian speech synthesis corpus, and was tested on the `test` split of the Common Voice 11.0 dataset.

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 5000
- mixed_precision_training: Native AMP

Framework versions:
- Transformers 4.26.0.dev0
- Pytorch 1.13.0+cu117
- Datasets 2.7.1.dev0
- Tokenizers 0.13.2

license: apache-2.0 • 299 downloads • 18 likes
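The training setup above uses a linear scheduler with 500 warmup steps over 5000 training steps: the learning rate ramps linearly from 0 to 1e-05, then decays linearly back to 0. A minimal sketch of that shape (mirroring, but not calling, the scheduler provided by the Transformers library):

```python
def linear_lr(step: int, base_lr: float = 1e-05,
              warmup_steps: int = 500, total_steps: int = 5000) -> float:
    """Linear warmup to base_lr, then linear decay to 0 at total_steps."""
    if step < warmup_steps:
        # Warmup phase: ramp from 0 up to base_lr.
        return base_lr * step / warmup_steps
    # Decay phase: ramp from base_lr down to 0.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

# Peak learning rate is reached exactly at the end of warmup (step 500).
print(linear_lr(500))   # 1e-05
print(linear_lr(5000))  # 0.0
```

The defaults here are the hyperparameters listed in the model card; the function names are illustrative only.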

SmolLM-500-rawrope

llama • 21 downloads • 0 likes

SmolLM-500-ropescaled

llama • 12 downloads • 0 likes

LunarLander-v2_PPO

7 downloads • 0 likes

graph_t5_230612

4 downloads • 0 likes

led_tib

3 downloads • 0 likes

distilhubert-audio-course-finetuned-gtzan-v5

license: apache-2.0 • 2 downloads • 0 likes

SmolLM-135M-scaled-rope-sw

llama • 2 downloads • 0 likes

pegasusx_tib

1 download • 0 likes

whisper-tiny-minds14-audio-course

license: apache-2.0 • 1 download • 0 likes

speecht5_finetuned_voxpopuli_ro_audio_course_v2

license: mit • 1 download • 0 likes

whisper-tiny-minds14-audio-course-v2

license: apache-2.0 • 1 download • 0 likes

graphlongt5-dependency-0228

1 download • 0 likes

graphlongt5-dependency-0308

1 download • 0 likes

graphlongt5-structural-0320

1 download • 0 likes

flan-t5fire-small

1 download • 0 likes

SmolLM-135M-ft-500-steps

llama • 1 download • 0 likes

SmolLM-135M-rescaled-ft-500-steps

llama • 1 download • 0 likes

SmolLM-mc4-500-ropescaled

llama • 1 download • 0 likes