Xenova

500 models • 25 total models in database

speecht5_tts

base_model: microsoft/speecht5_tts • library_name: transformers.js • pipeline_tag: text-to-speech

1,192,804
34

segformer-b0-finetuned-ade-512-512

If you haven't already, you can install the Transformers.js JavaScript library from NPM using `npm i @xenova/transformers`. Example: image segmentation with `Xenova/segformer-b0-finetuned-ade-512-512`. Note: having a separate repo for ONNX weights is intended to be a temporary solution until WebML gains more traction. If you would like to make your models web-ready, we recommend converting to ONNX using 🤗 Optimum and structuring your repo like this one (with ONNX weights located in a subfolder named `onnx`).

1,088,619
1

bge-base-en-v1.5

base_model: BAAI/bge-base-en-v1.5 • library_name: transformers.js • license: mit

1,077,550
8

t5-small

base_model: t5-small • library_name: transformers.js

597,808
6

all-MiniLM-L6-v2

base_model: sentence-transformers/all-MiniLM-L6-v2 • library_name: transformers.js • license: apache-2.0

430,494
89

whisper-medium

base_model: openai/whisper-medium • library_name: transformers.js

license:apache-2.0
182,368
7

paraphrase-multilingual-MiniLM-L12-v2


91,211
12

clip-vit-base-patch32

89,155
11

whisper-tiny.en

license:apache-2.0
48,878
23

jina-embeddings-v2-small-en


36,035
1

bge-small-en-v1.5

33,254
11

whisper-tiny

license:apache-2.0
29,847
10

bge-m3

license:mit
28,095
40

whisper-base.en

license:apache-2.0
22,017
1

jina-embeddings-v2-base-en


18,413
8

detr-resnet-50

18,087
16

clip-vit-base-patch16

15,304
9

gte-small

13,336
22

tiny-random-Phi3ForCausalLM

12,709
0

slimsam-77-uniform

license:apache-2.0
11,408
23

distilbert-base-uncased-finetuned-sst-2-english

You can then use the model to classify text with the `sentiment-analysis` pipeline.

8,417
11

sentence-camembert-large

7,730
0

distilgpt2

7,706
9

ms-marco-MiniLM-L-6-v2

7,401
4

e5-small-v2

7,348
3

modnet

For more information, check out the official repository and example colab. You can then use the model for portrait matting: given an input image, it produces an output mask isolating the subject.

license:apache-2.0
7,292
65

distiluse-base-multilingual-cased-v2

6,710
4

all-mpnet-base-v2

6,653
4

whisper-base

license:apache-2.0
6,464
8

distilbert-base-cased-distilled-squad

5,940
4

vit-base-patch16-224

5,315
2

toxic-bert

5,244
6

nomic-embed-text-v1


4,279
3

distilbart-cnn-6-6


license:apache-2.0
4,226
9

whisper-small


license:apache-2.0
4,175
16

multilingual-e5-small

3,831
8

nllb-200-distilled-600M

You can then perform multilingual translation, selecting source and target languages by their FLORES-200 codes; see the model card for the full list of languages and their corresponding codes.

license:cc-by-nc-4.0
3,702
50

bert-base-NER

3,635
1

multilingual-e5-base

3,184
1

depth-anything-small-hf

3,115
11

msmarco-distilbert-base-v4

3,078
0

gpt2

2,943
9

paraphrase-multilingual-mpnet-base-v2

2,888
4

musicgen-small

license:cc-by-nc-4.0
2,759
38

gelan-c_all

license:gpl-3.0
2,579
5

vit-gpt2-image-captioning

2,418
25

mobilebert-uncased-mnli

2,219
3

distilbert-base-uncased-mnli

2,095
1

opus-mt-en-es


1,961
3

llama2.c-stories15M

llama
1,870
7

bert-base-multilingual-cased-ner-hrl

1,853
4

ms-marco-TinyBERT-L-2-v2

1,667
3

bart-large-mnli

1,666
5

bge-large-en-v1.5

1,334
5

bge-reranker-base

1,236
4

e5-base-v2

1,188
0

multilingual-e5-large

1,132
11

bert-base-multilingual-uncased-sentiment

1,061
3

whisper-small.en

license:apache-2.0
1,010
4

dino-vits16

995
0

GIST-small-Embedding-v0


975
2

speecht5_hifigan

967
0

flan-t5-small

license:apache-2.0
953
5

twitter-roberta-base-sentiment-latest

941
0

LaMini-Flan-T5-783M

909
27

all-MiniLM-L12-v2

899
4

distilbert-base-uncased-distilled-squad

886
0

yolos-tiny

861
6

quickdraw-mobilevit-small

778
8

bert-base-uncased

774
1

resnet-50

698
0

distilbert-base-uncased

684
1

opus-mt-en-ru

654
0

detr-resnet-50-panoptic

643
1

yolos-small-300

618
0

Phi-3-mini-4k-instruct

license:mit
598
21

face-parsing

567
5

LaMini-Flan-T5-248M

544
1

nli-deberta-v3-xsmall

524
1

jina-embeddings-v2-base-zh

license:apache-2.0
514
1

opus-mt-en-zh


494
2

TinyLlama-1.1B-Chat-v1.0

llama
480
8

opus-mt-es-en

458
0

m2m100_418M

440
7

opus-mt-fr-en

439
1

distilbart-cnn-12-6

431
0

opus-mt-en-fr

405
2

whisper-large

license:apache-2.0
393
3

ms-marco-MiniLM-L-12-v2

391
0

sam-vit-base

Example: perform mask generation with `Xenova/sam-vit-base`. Next, select the channel with the highest IoU score, which in this case is the second (green) channel. Intersecting this mask with the original image gives us an isolated version of the subject. We've also got an online demo you can try out.

license:apache-2.0
384
6

opus-mt-en-id

383
1

bge-small-zh-v1.5

347
1

whisper-large-v3

license:apache-2.0
346
12

facial_emotions_image_detection

346
6

opus-mt-ar-en

339
2

clip-vit-large-patch14

337
1

LaMini-Flan-T5-77M

330
1

siglip-base-patch16-224

326
1

gelan-c

license:gpl-3.0
326
0

Qwen1.5-0.5B-Chat

317
8

opus-mt-en-de

309
0

opus-mt-hi-en

307
0

opus-mt-sv-en

304
1

flan-t5-base

288
0

UAE-Large-V1

license:mit
283
2

bart-large-cnn

277
8

segformer-b2-finetuned-ade-512-512

236
0

clap-htsat-unfused

227
0

Phi-3-mini-4k-instruct_fp16

license:mit
224
5

vit-base-patch16-224-in21k

223
4

trocr-small-printed

222
4

whisper-medium.en

license:apache-2.0
214
0

gte-large

189
2

yolov9-c_all

license:gpl-3.0
186
2

distiluse-base-multilingual-cased-v1

177
0

moondream2

license:apache-2.0
172
31

trocr-base-printed

169
1

nli-deberta-v3-small

167
0

opus-mt-en-ar

166
1

mms-tts-eng

162
0

opus-mt-en-it

161
0

bert-base-NER-uncased

154
0

LaMini-T5-61M

153
1

tiny-random-Florence2ForConditionalGeneration

152
6

segformer_b0_clothes

Example: clothes segmentation with `Xenova/segformer_b0_clothes`.

152
2

opus-mt-de-en

152
0

ms-marco-MiniLM-L-4-v2

152
0

owlvit-base-patch32

147
1

chinese-clip-vit-base-patch16

145
2

bge-small-en

144
1

tiny-random-LlavaForConditionalGeneration

license:apache-2.0
144
0

codegen-350M-mono

142
6

sam-vit-large

license:apache-2.0
137
0

SapBERT-from-PubMedBERT-fulltext

136
0

distilbert-base-multilingual-cased-ner-hrl

135
2

segformer-b2-finetuned-cityscapes-1024-1024

134
0

tiny-random-WhisperForConditionalGeneration

134
0

mobileclip_s0


130
4

tiny-random-GemmaForCausalLM

license:apache-2.0
125
3

distilbert-base-multilingual-cased-sentiments-student

120
1

dpt-hybrid-midas

119
0

trocr-small-handwritten

107
7

tiny-random-M2M100ForConditionalGeneration

104
0

trocr-base-handwritten

99
3

yolov9-c

license:gpl-3.0
98
6

tiny-random-Wav2Vec2ForCTC-ONNX

license:apache-2.0
98
0

tiny-random-vits

95
0

gpt-neo-125M

94
0

wav2vec2-base-960h

93
3

wavlm-base-plus-sv

90
0

swin2SR-classical-sr-x2-64

89
3

LaBSE

82
2

paraphrase-MiniLM-L6-v2

82
0

bert-base-multilingual-cased

81
3

t5-base

81
1

bert-base-cased

80
0

DeBERTa-v3-base-mnli

78
0

donut-base-finetuned-docvqa

77
17

whisper-large-v2

76
8

finbert

69
2

LaMini-GPT-774M

69
0

mt5-small

68
1

multi-qa-MiniLM-L6-cos-v1

66
3

sponsorblock-small

66
2

opus-mt-it-en

65
0

mbart-large-50-many-to-many-mmt

64
6

tiny_starcoder_py

64
2

bert-base-multilingual-uncased

61
1

opus-mt-ru-en

59
0

larger_clap_music_and_speech

58
2

multi-qa-mpnet-base-dot-v1

57
1

opus-mt-ko-en

56
0

depth-anything-base-hf

56
0

opus-mt-zh-en

53
4

paraphrase-MiniLM-L3-v2

53
0

gte-base

53
0

distilbart-cnn-12-3

52
0

jina-embeddings-v2-base-de

license:apache-2.0
50
3

segformer_b2_clothes

49
3

swin2SR-realworld-sr-x4-64-bsrgan-psnr

Example: upscale an image with `Xenova/swin2SR-realworld-sr-x4-64-bsrgan-psnr`.

49
3

opus-mt-ja-en

49
1

opus-mt-nl-en

49
0

mms-tts-ara

47
0

bge-reranker-large

46
4

mDeBERTa-v3-base-xnli-multilingual-nli-2mil7

46
2

opus-mt-en-hi

46
1

LaMini-T5-738M

45
0

mobileclip_b

45
0

slimsam-50-uniform

license:apache-2.0
44
0

nb-whisper-tiny-beta

44
0

clip-vit-large-patch14-336

43
0

e5-large-v2

42
5

detr-resnet-101

42
2

e5-small

42
0

dinov2-small

42
0

llama2.c-stories110M

llama
41
5

all-distilroberta-v1

41
0

dinov2-large

40
1

pix2struct-tiny-random

40
0

pix2struct-ai2d-base

40
0

nli-deberta-v3-base

39
2

flan-alpaca-base

39
1

depth-anything-large-hf

37
4

bloomz-560m

37
1

LaMini-Cerebras-256M

37
0

opus-mt-cs-en

36
1

distilbart-xsum-12-6

36
1

all-roberta-large-v1

36
0

ast-finetuned-audioset-10-10-0.4593

base_model:MIT/ast-finetuned-audioset-10-10-0.4593
36
0

owlv2-base-patch16-finetuned

36
0

bge-large-zh-v1.5

35
5

opus-mt-mul-en

35
1

convnext-tiny-224

35
1

paraphrase-mpnet-base-v2

35
0

bge-base-zh-v1.5

34
2

swin2SR-lightweight-x2-64

34
1

paraphrase-albert-small-v2

34
0

opus-mt-th-en

34
0

opus-mt-en-sv

33
1

convnextv2-nano-22k-384

33
0

LaMini-Cerebras-590M

31
2

distilbart-xsum-6-6

31
1

LaMini-T5-223M

31
0

distilbert-base-nli-mean-tokens

31
0

DeBERTa-v3-base-mnli-fever-anli

31
0

mobilevit-x-small

30
1

long-t5-tglobal-base-16384-book-summary

30
1

mt5-base

30
0

squeezebert-uncased

30
0

llama-68m

llama
30
0

siglip-base-patch16-512

Example: zero-shot image classification with `Xenova/siglip-base-patch16-512`. Example: compute text embeddings with `SiglipTextModel`. Example: compute vision embeddings with `SiglipVisionModel`.

29
2

wav2vec2-large-xlsr-53-gender-recognition-librispeech

29
1

bert-base-chinese

29
1

opus-mt-en-vi

29
1

bert-base-nli-mean-tokens

29
0

distilbert-base-nli-stsb-mean-tokens

29
0

bge-base-en

29
0

distilbart-xsum-12-1

29
0

robertuito-sentiment-analysis

29
0

opus-mt-pl-en

28
1

mbart-large-50-many-to-one-mmt

28
1

texify

28
1

dpt-large

28
0

Qwen1.5-1.8B-Chat

28
0

4x_APISR_GRL_GAN_generator-onnx

Example: upscale an image with `Xenova/4x_APISR_GRL_GAN_generator-onnx`.

license:gpl-3.0
27
13

clipseg-rd64-refined

27
3

llama-160m

llama
27
2

mms-tts-fra

27
1

phi-1_5_dev

27
0

roberta-large-mnli

26
2

ClinicalBERT

26
2

bart-large-xsum

26
1

xlm-r-100langs-bert-base-nli-stsb-mean-tokens

26
0

e5-large

26
0

e5-base

26
0

distilbart-xsum-9-6

26
0

long-t5-tglobal-base

26
0

swin2SR-compressed-sr-x4-48

25
1

scibert_scivocab_uncased

25
0

opus-mt-en-nl

25
0

distilbart-xsum-12-3

25
0

LaMini-GPT-124M

24
3

deberta-large-mnli-zero-cls

24
2

t5-base-grammar-correction

24
1

sponsorblock-classifier-v2

24
0

nli-mpnet-base-v2

24
0

mobilevit-small

24
0

sentence_bert

24
0

opus-mt-en-hu

24
0

segformer-b5-finetuned-ade-640-640

24
0

distilroberta-finetuned-financial-news-sentiment-analysis

23
5

resnet-18

23
2

LiteLlama-460M-1T

llama
23
2

deberta-v3-base-tasksource-nli

23
1

squeezebert-mnli

23
0

indobert-base-p1

23
0

bge-base-zh

23
0

larger_clap_general

23
0

table-transformer-detection

23
0

texify2

22
3

opus-mt-en-cs

22
1

rubert-base-cased

22
0

mms-lid-1024

22
0

DeBERTa-v3-xsmall-mnli-fever-anli-ling-binary

22
0

nli-deberta-v3-large

22
0

opus-mt-vi-en

22
0

wavlm-base-plus

22
0

blenderbot-400M-distill

22
0

esm2_t6_8M_UR50D_sequence_classifier_v1

22
0

siglip-large-patch16-384

22
0

tiny-random-ErnieMModel

22
0

WizardCoder-1B-V1.0

21
4

mms-tts-rus

21
2

bge-large-zh

21
1

paraphrase-albert-base-v2

21
0

bert-base-chinese-ner

21
0

multi-qa-distilbert-cos-v1

21
0

spanbert-large-cased

21
0

spanbert-base-cased

21
0

kobert

21
0

mms-lid-2048

21
0

DeBERTa-v3-large-mnli-fever-anli-ling-wanli

21
0

yolos-small

21
0

pythia-14m

21
0

conv-bert-base

21
0

nucleotide-transformer-500m-1000g

21
0

sam-vit-huge

license:apache-2.0
20
2

distilbert-base-cased

20
1

t5-v1_1-small

20
1

opus-mt-en-jap

20
1

bert-base-chinese-pos

20
0

roberta-base

20
0

multi-qa-mpnet-base-cos-v1

20
0

mms-lid-126

20
0

opt-350m

20
0

herbert-large-cased

20
0

tiny-random-Swin2SRModel

20
0

text2vec-base-chinese-sentence

20
0

instructor-large

19
2

wavlm-large

19
2

geoclip-large-patch14

license:mit
19
2

albert-base-v2

19
1

xlm-roberta-base

19
1

nli-deberta-base

19
0

dinov2-base

19
0

tiny-random-RoFormerModel

19
0

tiny-random-ErnieModel

19
0

colbertv2.0

18
7

UMLSBert_ENG

18
1

mms-lid-4017

18
1

w2v-bert-2.0

18
1

stablelm-2-zephyr-1_6b

18
1

ernie-3.0-mini-zh

18
1

ernie-3.0-micro-zh

18
1

propositionizer-wiki-flan-t5-large

18
1

bge-large-en

18
0

yolos-base

18
0

beit-large-patch16-512

18
0

swin2SR-classical-sr-x4-64

18
0

conv-bert-small

18
0

flan-alpaca-large

18
0

siglip-base-patch16-384

18
0

wav2vec2-bert-CV16-en

18
0

owlv2-base-patch16

18
0

nanoLLaVA

license:apache-2.0
17
12

wav2vec2-base-superb-ks

17
1

deberta-v3-large-tasksource-nli

17
1

swin-tiny-patch4-window7-224

17
1

ernie-gram-zh

17
1

text2vec-base-chinese-paraphrase

17
1

mms-lid-256

17
0

opus-mt-ROMANCE-en

17
0

wavlm-base

17
0

opus-mt-en-ro

17
0

opus-mt-hu-en

17
0

pygmalion-350m

17
0

herbert-base-cased

17
0

long-t5-encodec-tglobal-base

17
0

electra-base-discriminator

17
0

electra-small-discriminator

17
0

nomic-embed-text-v1-unsupervised

17
0

2x_APISR_RRDB_GAN_generator-onnx

Example: upscale an image with `Xenova/2x_APISR_RRDB_GAN_generator-onnx`.

license:gpl-3.0
16
2

t5-v1_1-base

16
1

hubert-base-ls960

16
1

ernie-3.0-nano-zh

16
1

mobilevit-xx-small

16
0

instructor-base

16
0

sentence-t5-large

16
0

deit-base-distilled-patch16-224

16
0

pythia-70m

16
0

opus-mt-en-uk

16
0

blenderbot_small-90M

16
0

owlvit-large-patch14

16
0

conv-bert-medium-small

16
0

deepseek-coder-1.3b-instruct

llama
16
0

Phi-3-mini-4k-instruct-hf

16
0

TinyLLama-v0

llama
15
2

llama2.c-stories42M

llama
15
1

opus-mt-id-en

15
1

mms-300m

15
0

opus-mt-tr-en

15
0

bge-large-zh-noinstruct

15
0

swin-base-patch4-window7-224

15
0

swin-small-patch4-window7-224

15
0

nb-whisper-large-beta

15
0

nucleotide-transformer-500m-human-ref

15
0

ms-marco-MiniLM-L-2-v2

15
0

siglip-base-patch16-256

15
0

nomic-embed-text-v1-ablated

15
0

yolov9-e

license:gpl-3.0
15
0

mms-tts-spa

14
2

mms-tts-deu

14
2

albert-large-v2

14
0

mms-1b-fl102

14
0

mms-1b

14
0

tamillama_tiny_30m

llama
14
0

opt-125m

14
0

convnext-large-224-22k

14
0

convnextv2-tiny-22k-224

14
0

mms-tts-yor

14
0

table-transformer-structure-recognition-v1.1-all

14
0

table-transformer-structure-recognition-v1.1-fin

14
0

ernie-2.0-large-en

14
0

ernie-2.0-base-en

14
0

ernie-health-zh

14
0

nougat-small

13
5

mms-tts-hin

Example: generate Hindi speech with `Xenova/mms-tts-hin`. Optionally, save the audio to a WAV file (Node.js).

13
3

ast-finetuned-speech-commands-v2


base_model:MIT/ast-finetuned-speech-commands-v2
13
2

opus-mt-en-ROMANCE

13
1

opus-mt-en-mul

13
1

mms-lid-512

13
0

deit-tiny-distilled-patch16-224

13
0

deit-base-distilled-patch16-384

13
0

beit-base-patch16-224-pt22k-ft22k

13
0

opus-mt-uk-ru

13
0

owlvit-base-patch16

13
0

table-transformer-structure-recognition

13
0

owlv2-base-patch16-ensemble

13
0

t5-small-awesome-text-to-sql

12
6

mms-tts-vie

12
2

convnext-xlarge-384-22k-1k

12
1

OpenELM-270M-Instruct

12
1

distilroberta-base

12
0

bloom-560m

12
0

opus-mt-en-fi

12
0

swin-large-patch4-window12-384-in22k

12
0

swin-large-patch4-window7-224

12
0

swin-large-patch4-window12-384

12
0

ipt-350m

12
0

beit-large-patch16-384

12
0

opus-mt-ru-uk

12
0

opus-mt-uk-en

12
0

opus-mt-es-de

12
0

opus-mt-fr-es

12
0

opus-mt-de-es

12
0

opus-mt-en-af

12
0

convnext-base-224

12
0

convnext-large-224

12
0

convnext-large-384-22k-1k

12
0

dinov2-base-imagenet1k-1-layer

12
0

dinov2-small-imagenet1k-1-layer

12
0

fastvit_t8.apple_dist_in1k

12
0

fastvit_sa36.apple_dist_in1k

12
0

mbart-large-50

11
2

siglip-large-patch16-256

11
2

resnet-152

11
1

bert-base-chinese-ws

11
0

dino-vitb16

11
0

deeplabv3-mobilevit-small

11
0

opus-mt-fi-en

11
0

opus-mt-gmw-gmw

11
0

opus-mt-en-da

11
0

swin-base-patch4-window12-384

11
0

gpt-neo-romanian-125m

11
0

pythia-70m-deduped

11
0

resnet-26

11
0

beit-base-patch16-224

11
0

dit-large-finetuned-rvlcdip

11
0

opus-mt-ro-fr

11
0

opus-mt-ru-fr

11
0

opus-mt-xh-en

11
0

opus-mt-es-fr

11
0

opus-mt-es-ru

11
0

opus-mt-de-fr

11
0

LaMini-Cerebras-111M

11
0

tiny-random-mistral

11
0

convnext-small-224

11
0

convnext-base-224-22k

11
0

convnext-base-384

11
0

convnext-base-384-22k-1k

11
0

convnext-large-224-22k-1k

11
0

convnext-large-384

11
0

convnext-xlarge-224-22k

11
0

convnext-xlarge-224-22k-1k

11
0

convnextv2-atto-1k-224

11
0

convnextv2-base-22k-224

11
0

convnextv2-base-22k-384

11
0

dinov2-large-imagenet1k-1-layer

11
0

segformer-b1-finetuned-cityscapes-1024-1024

11
0

fastvit_ma36.apple_dist_in1k

11
0

fastvit_sa12.apple_in1k

11
0

nougat-base

10
3

chinese-clip-vit-large-patch14-336px

10
3

wav2vec2-large-xlsr-53-english

10
2

mms-tts-por

10
2

opus-mt-jap-en

10
1

vitmatte-small-composition-1k

10
1

fastvit_s12.apple_dist_in1k

10
1

distilgpt2_onnx-quantized

10
0

LaMini-Neo-125M

10
0

dino-vitb8

10
0

opus-mt-tc-big-tr-en

10
0

opus-mt-gem-gem

10
0

opus-mt-it-fr

10
0

opus-mt-fr-de

10
0

deit-small-distilled-patch16-224

10
0

pythia-160m

10
0

resnet-34

10
0

resnet-101

10
0

donut-base-finetuned-cord-v2

10
0