blaze999
3 models
Medical-NER
This model is a fine-tuned version of DeBERTa trained on the PubMed dataset; it is a medical NER model fine-tuned to recognize 41 medical entity types.

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
- mixed_precision_training: Native AMP

Usage
The easiest way is to load the model through the Hugging Face Inference API; the second is through the pipeline object offered by the transformers library.

Framework versions:
- Transformers 4.37.0
- Pytorch 2.1.2
- Datasets 2.1.0
- Tokenizers 0.15.1
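Of the two usage routes mentioned above, the pipeline route can be sketched as follows. This is a minimal sketch, not taken from the card itself: the repo id blaze999/Medical-NER is assumed from the author and model names shown on this page, and the sample sentence is illustrative.

```python
def load_medical_ner():
    """Build a token-classification pipeline for the model (lazy import,
    so transformers is only needed when the pipeline is actually used)."""
    from transformers import pipeline

    # Repo id assumed from the author/model names shown on this page.
    return pipeline(
        "token-classification",
        model="blaze999/Medical-NER",
        aggregation_strategy="simple",  # merge subword pieces into entity spans
    )

# Effective batch size implied by the training card:
# train_batch_size (8) x gradient_accumulation_steps (2)
EFFECTIVE_BATCH_SIZE = 8 * 2  # = 16, matching total_train_batch_size above

# Example (downloads the model weights on first use):
#   ner = load_medical_ner()
#   for ent in ner("Patient with type 2 diabetes was prescribed metformin."):
#       print(ent["entity_group"], ent["word"], round(ent["score"], 3))
```

With aggregation_strategy="simple", the pipeline merges subword tokens back into whole-word entity spans, so each result carries an entity_group label rather than per-token B-/I- tags.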
license:mit • 87,735 • 220
clinical-ner
license:mit • 176 • 1
pii-ner
license:mit • 3 • 0