prince-canuma

28 models

Kokoro-82M

license:apache-2.0
492
4

Ministral-8B-Instruct-2410-HF

Ministral-8B-Instruct-2410-HF is the Hugging Face version of Ministral-8B-Instruct-2410 by Mistral AI. It is a multilingual instruction-tuned language model based on the Mistral architecture, designed for a range of natural-language-processing tasks with a focus on chat-based interactions.

- Developed by: Mistral AI
- Model type: Causal Language Model
- Language(s): English
- License: mrl
- Resources for more information: Model Repository, Mistral AI GitHub
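A chat-completion call against this checkpoint can be sketched with the standard `transformers` chat API. The repo ID below points at this Hugging Face conversion; the example prompt and generation settings are illustrative assumptions, not values from the model card.

```python
# Sketch: chat completion with Ministral-8B-Instruct-2410-HF via transformers.
# The message format and apply_chat_template call are the standard transformers
# chat API; the prompt and max_new_tokens are illustrative assumptions.
MODEL_ID = "prince-canuma/Ministral-8B-Instruct-2410-HF"

# Chat-style input: a list of role/content messages.
messages = [
    {"role": "user", "content": "Explain instruction tuning in one sentence."},
]

def chat(messages, max_new_tokens=128):
    # Imported lazily so this file can be read without transformers installed;
    # calling chat() downloads the full model weights on first use.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    # Render the conversation with the model's chat template, then generate.
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    with torch.no_grad():
        out = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens.
    return tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True)

# print(chat(messages))  # uncomment to run (large weight download)
```

The heavy model load is kept inside `chat()` so the script can be inspected or imported without triggering a multi-gigabyte download.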

21
11

Damysus-2.7B-Chat-GGUF

license:mit
14
0

bark-small

license:mit
12
0

Spark-TTS-0.5B

license:cc-by-nc-sa-4.0
9
3

Kokoro-82M-4bit

license:apache-2.0
5
0

Kokoro-82M-3bit

license:apache-2.0
5
0

Kokoro-82M-8bit

license:apache-2.0
4
1

test-cpu

4
0

Llama-3-6B-v0.1

llama
3
14

c4ai-command-r-v01-4bit

3
4

Kokoro-82M-6bit

license:apache-2.0
3
0

Mistral-Small-3.1-24B-Instruct-2503

license:apache-2.0
3
0

Damysus-2.7B-Chat

license:mit
2
4

Florence-2-large-ft

license:mit
2
0

deepseek-vl2-small

2
0

Phi-4-reasoning-Plus-6bit

license:mit
2
0

deepseek-vl2

1
2

Florence-2-base-ft

license:mit
1
1

Meta-Llama-3-8B-Instruct-AWQ

llama
1
0

Llama-3.1-250M

llama
1
0

nanoVLM

nanoVLM is a minimal, lightweight Vision-Language Model (VLM) designed for efficient training and experimentation. Built in pure PyTorch, the entire model architecture and training logic fit within roughly 750 lines of code. It combines a ViT-based image encoder (SigLIP-B/16-224-85M) with a lightweight causal language model (SmolLM2-135M), yielding a compact 222M-parameter model. For more information, see the base model at https://huggingface.co/lusxvr/nanoVLM-222M. To try it, clone the nanoVLM repository (https://github.com/huggingface/nanoVLM) and follow its install instructions.
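Once the repository is cloned and installed, loading the pretrained checkpoint might look like the sketch below. The `VisionLanguageModel` class path and `from_pretrained` call are assumptions based on the repository's described layout, not a verified API.

```python
# Hypothetical usage sketch for nanoVLM. Assumes the cloned repo is on the
# Python path and exposes a VisionLanguageModel class with from_pretrained,
# as its README describes; the import path below is an assumption.
HUB_ID = "lusxvr/nanoVLM-222M"  # base checkpoint referenced above

def load_nanovlm(hub_id: str = HUB_ID):
    # Imported lazily so this module can be read without the repo installed.
    from models.vision_language_model import VisionLanguageModel  # hypothetical path
    return VisionLanguageModel.from_pretrained(hub_id)

# model = load_nanovlm()  # uncomment after cloning and installing nanoVLM
```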

license:mit
1
0

WizardLM-2-8x22B

license:apache-2.0
0
5

Llama-3-6B-v0

llama
0
4

Mixtral-8x22B-v0.1-4bit

license:apache-2.0
0
2

babyLlama

llama
0
1

c4ai-command-r-v01-tokenizer-chat-template

0
1

Meta-Llama-3-8B-bnb-4bit

llama
0
1