Kimi-K2-Instruct-0905-mlx-DQ3_K_M
by mlx-community
Language Model
Quick Summary
This model, mlx-community/Kimi-K2-Instruct-0905-mlx-DQ3_K_M, was converted to MLX format from moonshotai/Kimi-K2-Instruct-0905 using mlx-lm version 0.
Device Compatibility
Mobile: 4-6GB RAM
Laptop: 16GB RAM
Server: GPU
Minimum recommended: 2333GB+ RAM
Code Examples
```bash
pip install mlx-lm
mlx_lm.generate --model mlx-community/Kimi-K2-Instruct-0905-mlx-DQ3_K_M --temp 0.6 --min-p 0.01 --max-tokens 4096 --trust-remote-code --prompt "Hallo"
```
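The same generation can also be driven from Python through the mlx-lm library. The sketch below mirrors the CLI flags above (`--temp 0.6 --min-p 0.01 --max-tokens 4096`); it assumes a recent mlx-lm release where `generate` accepts a `sampler`, and it requires an Apple Silicon machine with enough unified memory to hold this model's weights, so it is illustrative rather than something to run on typical hardware:

```python
# Sketch: running this model via the mlx-lm Python API.
# Assumes a recent mlx-lm version; loading the weights needs an
# Apple Silicon machine with very large unified memory.
from mlx_lm import load, generate
from mlx_lm.sample_utils import make_sampler

model, tokenizer = load("mlx-community/Kimi-K2-Instruct-0905-mlx-DQ3_K_M")

# Sampling settings matching the CLI: --temp 0.6 --min-p 0.01
sampler = make_sampler(temp=0.6, min_p=0.01)

# Wrap the user message in the model's chat template, as the CLI does.
prompt = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Hallo"}],
    add_generation_prompt=True,
)

# Equivalent of --max-tokens 4096
text = generate(model, tokenizer, prompt=prompt, max_tokens=4096, sampler=sampler)
print(text)
```

Because the weights are far too large for most machines, the CLI invocation above remains the simplest way to smoke-test the conversion.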