deepset

47 models

roberta-base-squad2

English extractive question answering model fine-tuned on SQuAD 2.0 (verified Exact Match 79.93 on the squad_v2 validation split).

license: cc-by-4.0 · 700,563 downloads · 926 likes
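The model card's usage snippet did not survive extraction. As a minimal sketch, this model can typically be loaded with the Hugging Face `transformers` question-answering pipeline (the question and context below are invented for illustration):

```python
from transformers import pipeline

# Load the extractive QA pipeline with this model
# (downloads the weights on first run).
qa = pipeline("question-answering", model="deepset/roberta-base-squad2")

result = qa(
    question="What is extractive question answering?",
    context=(
        "Extractive question answering selects the answer span directly "
        "from a given context passage instead of generating free text."
    ),
)
# result is a dict with the predicted span and a confidence score
print(result["answer"], result["score"])
```

The pipeline returns the highest-scoring answer span together with its character offsets in the context, so the prediction can be highlighted in the source text.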

gelectra-large

German language model pretrained on Wikipedia, OPUS, OpenLegalData, and OSCAR.

license: mit · 248,770 downloads · 18 likes

bert-large-uncased-whole-word-masking-squad2

English extractive question answering model fine-tuned on SQuAD 2.0 (verified Exact Match 80.88 on the squad_v2 validation split).

license: cc-by-4.0 · 195,428 downloads · 31 likes

deberta-v3-base-injection

license: mit · 78,835 downloads · 40 likes

xlm-roberta-base-squad2

license: cc-by-4.0 · 60,900 downloads · 25 likes

bert-base-cased-squad2

license: cc-by-4.0 · 53,791 downloads · 21 likes

tinyroberta-squad2

license: cc-by-4.0 · 32,527 downloads · 111 likes

roberta-base-squad2-distilled

license: mit · 24,199 downloads · 15 likes

gbert-base

license: mit · 14,845 downloads · 42 likes

gbert-large

license: mit · 11,828 downloads · 58 likes

xlm-roberta-large-squad2

Multilingual XLM-RoBERTa large for extractive QA across various languages.

Overview
- Language model: xlm-roberta-large
- Language: Multilingual
- Downstream task: Extractive QA
- Training data: SQuAD 2.0
- Eval data: SQuAD dev set, German MLQA, German XQuAD
- Training run: MLFlow link
- Code: see an example extractive QA pipeline built with Haystack
- Infrastructure: 4x Tesla V100

In Haystack
Haystack is an AI orchestration framework for building customizable, production-ready LLM applications. You can use this model in Haystack for extractive question answering on documents, including QA at scale over many documents rather than a single paragraph. For a complete example with an extractive question answering pipeline that scales over many documents, check out the corresponding Haystack tutorial.

Performance
Evaluated on the SQuAD 2.0 English dev set with the official eval script, and on German MLQA (test-context-de-question-de.json).

Authors
- Branden Chan: [email protected]
- Timo Möller: [email protected]
- Malte Pietsch: [email protected]
- Tanay Soni: [email protected]

deepset is the company behind the production-ready open-source AI framework Haystack. Some of our other work:
- Distilled roberta-base-squad2 (aka "tinyroberta-squad2")
- German BERT, GermanQuAD and GermanDPR, German embedding model
- deepset Cloud, deepset Studio

For more info on Haystack, visit our GitHub repo and Documentation. Twitter | LinkedIn | Discord | GitHub Discussions | Website | YouTube

license: cc-by-4.0 · 11,470 downloads · 55 likes

bert-medium-squad2-distilled

license: mit · 10,484 downloads · 3 likes

bert-base-uncased-squad2

Overview
- Language model: bert-base-uncased
- Language: English
- Downstream task: Extractive QA
- Training data: SQuAD 2.0
- Eval data: SQuAD 2.0
- Code: see an example extractive QA pipeline built with Haystack
- Infrastructure: 1x Tesla V100

In Haystack
Haystack is an AI orchestration framework for building customizable, production-ready LLM applications. You can use this model in Haystack for extractive question answering on documents. For a complete example with an extractive question answering pipeline that scales over many documents, check out the corresponding Haystack tutorial.

Authors
- Timo Möller: `timo.moeller [at] deepset.ai`
- Julian Risch: `julian.risch [at] deepset.ai`
- Malte Pietsch: `malte.pietsch [at] deepset.ai`
- Michel Bartels: `michel.bartels [at] deepset.ai`

deepset is the company behind the production-ready open-source AI framework Haystack. Some of our other work:
- Distilled roberta-base-squad2 (aka "tinyroberta-squad2")
- German BERT, GermanQuAD and GermanDPR, German embedding model
- deepset Cloud, deepset Studio

For more info on Haystack, visit our GitHub repo and Documentation. Twitter | LinkedIn | Discord | GitHub Discussions | Website | YouTube

license: cc-by-4.0 · 8,539 downloads · 8 likes

minilm-uncased-squad2

license: cc-by-4.0 · 7,415 downloads · 45 likes

roberta-large-squad2

license: cc-by-4.0 · 7,037 downloads · 28 likes

gelectra-large-germanquad

license: mit · 5,975 downloads · 27 likes

gbert-base-germandpr-question_encoder

license: mit · 5,741 downloads · 5 likes

deberta-v3-base-squad2

license: cc-by-4.0 · 4,194 downloads · 20 likes

sentence_bert

license: apache-2.0 · 4,001 downloads · 21 likes

deberta-v3-large-squad2

license: cc-by-4.0 · 3,614 downloads · 51 likes

gelectra-base-germanquad

license: mit · 3,439 downloads · 26 likes

electra-base-squad2

license: cc-by-4.0 · 1,115 downloads · 17 likes

bert-base-german-cased-hatespeech-GermEval18Coarse

license: cc-by-4.0 · 1,090 downloads · 10 likes

bert-small-mm_retrieval-passage_encoder

486 downloads · 1 like

xlm-roberta-base-squad2-distilled

license: mit · 472 downloads · 11 likes

bert-small-mm_retrieval-question_encoder

429 downloads · 2 likes

bert-small-mm_retrieval-table_encoder

412 downloads · 0 likes

gelectra-base

license: mit · 261 downloads · 10 likes

gbert-base-germandpr-ctx_encoder

license: mit · 167 downloads · 7 likes

tapas-large-nq-hn-reader

license: apache-2.0 · 131 downloads · 2 likes

gbert-base-germandpr-reranking

license: mit · 120 downloads · 6 likes

all-mpnet-base-v2-table

97 downloads · 7 likes

tinybert-6l-768d-squad2

license: mit · 85 downloads · 1 like

gelectra-base-generator

license: mit · 24 downloads · 3 likes

gbert-large-sts

license: mit · 18 downloads · 8 likes

flan-t5-xl-squad2

license: cc-by-4.0 · 16 downloads · 1 like

roberta-large-squad2-hp

14 downloads · 3 likes

bert-base-german-cased-sentiment-Germeval17

13 downloads · 3 likes

roberta-base-squad2-covid

license: cc-by-4.0 · 11 downloads · 5 likes

bert-base-german-cased-oldvocab

license: mit · 11 downloads · 3 likes

tinyroberta-squad2-step1

9 downloads · 0 likes

tapas-large-nq-reader

license: apache-2.0 · 7 downloads · 2 likes

tinyroberta-6l-768d

license: cc-by-4.0 · 4 downloads · 3 likes

covid_bert_base

3 downloads · 6 likes

gelectra-base-germanquad-distilled

license: mit · 3 downloads · 4 likes

quora_dedup_bert_base

license: apache-2.0 · 3 downloads · 4 likes

gelectra-large-generator

license: mit · 1 download · 2 likes