webnn
13 models
Model                                     License      Downloads  Likes
efficientnet-lite4                        apache-2.0         389      0
mobilenet-v2                              apache-2.0         320      1
yolo12n                                   agpl-3.0            23      1
yolov8m                                   agpl-3.0            21      0
yolo11n                                   agpl-3.0            17      0
modnet                                    —                   14      0
yolov8n                                   agpl-3.0            12      0
Phi-4-mini-instruct-onnx-transformers_js  mit                  3      1
gelan-c_all                               apache-2.0           3      0
Depth Anything V2 Small 504               —                    2      1
ssd-mobilenet-v1                          —                    1      1
Depth Anything V2 Small 518               —                    1      1
mobilenet-v2-12                           apache-2.0           1      0

Phi-4-mini-instruct-onnx-transformers_js: Based on https://huggingface.co/microsoft/Phi-4-mini-instruct. The ONNX model was converted with https://github.com/microsoft/onnxruntime-genai using the command:

python -m onnxruntime_genai.models.builder -m microsoft/Phi-4-mini-instruct -o Phi-4-mini-instruct-onnx -e webgpu -c cache-dir -p int4 --extra_options int4_block_size=32 int4_accuracy_level=4

The generated external data file (model_q4f16.onnx.data) is larger than 2 GB, which is not suitable for ORT-Web, so an additional Python script moves some of the data into model.onnx.