ARTeLab
7 models
mbart-summarization-mlsum: 1,703 downloads, 2 likes
it5-summarization-fanpage: 481 downloads, 2 likes
mbart-summarization-fanpage: 35 downloads, 0 likes
it5-summarization-mlsum: 28 downloads, 0 likes
it5-summarization-ilpost: 8 downloads, 0 likes
mbart-summarization-ilpost: 7 downloads, 0 likes
It5 Summarization Fanpage 64
This model is a fine-tuned version of gsarti/it5-base on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.4897
- Rouge1: 33.5599
- Rouge2: 15.7432
- Rougel: 25.5253
- Rougelsum: 28.1784
- Gen Len: 57.2958

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 6
- eval_batch_size: 6
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4.0

Framework versions:
- Transformers 4.12.0.dev0
- Pytorch 1.9.1+cu102
- Datasets 1.12.1
- Tokenizers 0.10.3
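The Rouge scores reported above measure n-gram overlap between generated and reference summaries. As an illustration of what the Rouge1 number means, here is a minimal pure-Python sketch of unigram ROUGE-1 F1 (the real evaluation uses a proper ROUGE implementation with stemming and tokenization; this simplified `rouge1_f1` helper is only for intuition):

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """Simplified unigram ROUGE-1 F1: clipped unigram overlap
    turned into precision/recall, then harmonic mean."""
    ref = Counter(reference.lower().split())
    cand = Counter(candidate.lower().split())
    # Counter intersection clips each word's count to the smaller of the two
    overlap = sum((ref & cand).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# 5 of 6 candidate unigrams match the reference -> F1 = 5/6
print(round(rouge1_f1("the cat sat on the mat",
                      "the cat lay on the mat"), 4))  # → 0.8333
```

A reported Rouge1 of 33.56 corresponds to an average score of roughly 0.3356 on this 0-to-1 scale across the evaluation set.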