Seyfelislem
# Whisper Medium Arabic
This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2194
- Wer: 18.2888

## Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 800
- mixed_precision_training: Native AMP

## Training results

| Training Loss | Epoch | Step | Validation Loss | Wer     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.3327        | 1.0   | 800  | 0.2194          | 18.2888 |

## Framework versions

- Transformers 4.27.0.dev0
- Pytorch 1.13.0
- Datasets 2.10.2.dev0
- Tokenizers 0.13.2
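For reference, the hyperparameters listed above correspond roughly to a `Seq2SeqTrainingArguments` configuration like the sketch below. The argument names are from the Hugging Face Transformers API; the `output_dir` path is a hypothetical placeholder, and all numeric values are taken from the table above.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the training configuration implied by the hyperparameters above.
# output_dir is hypothetical; the other values come from the list in this card.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-medium-arabic",   # hypothetical output path
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=16,         # effective batch size: 2 * 16 = 32
    warmup_steps=500,
    max_steps=800,
    seed=42,
    lr_scheduler_type="linear",
    fp16=True,                              # Native AMP mixed-precision training
)
```

Note that the total train batch size of 32 is not set directly; it follows from `per_device_train_batch_size * gradient_accumulation_steps` (2 × 16) on a single device.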