LFM2-1.2B-GGUF
by LiquidAI

Language model · 1.2B params · 8 languages · Q4 quantization · llama.cpp-compatible GGUF
~14K downloads · community-tested
Edge AI targets: mobile, laptop, server · 3GB+ RAM
Quick Summary

LFM2 is a new generation of hybrid models developed by Liquid AI, specifically designed for edge AI and on-device deployment.
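
For local, on-device inference, the GGUF weights can also be loaded through llama-cpp-python (Python bindings for llama.cpp) instead of the llama-cli command shown under Code Examples below. The snippet is a minimal sketch, not an official example: the quantization filename pattern, context size, and prompt are assumptions, so check the repository's file list for the exact GGUF name.

# Minimal sketch: on-device inference with llama-cpp-python.
# Assumptions: `pip install llama-cpp-python huggingface_hub` has been run,
# and a Q4_K_M quantization file exists in the repo (adjust the glob if not).
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="LiquidAI/LFM2-1.2B-GGUF",  # this model repository
    filename="*Q4_K_M.gguf",            # assumed quantization; match an actual file
    n_ctx=4096,                         # context window; lower it on RAM-constrained devices
    verbose=False,
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain edge AI in one sentence."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])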

Device Compatibility

Mobile: 4-6GB RAM
Laptop: 16GB RAM
Server: GPU
Minimum recommended: 2GB+ RAM

Code Examples

🏃 How to run LFM2
llama-cli -hf LiquidAI/LFM2-1.2B-GGUF
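
In recent llama.cpp builds, the -hf flag tells llama-cli to fetch the GGUF weights directly from the named Hugging Face repository, cache them locally, and then start generation (typically an interactive chat session for instruction-tuned models). Exact flags vary between llama.cpp releases, so check llama-cli --help for your build.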

Deploy This Model

Production-ready deployment in minutes.

Together.ai (fastest API): instant API access to this model. Production-ready inference API; start free, scale to millions. A hedged client sketch follows below.

Replicate (easiest setup): one-click model deployment. Run models in the cloud with a simple API, no DevOps required.
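
As an illustration of the hosted-API path, the sketch below calls an OpenAI-compatible chat endpoint with the official openai Python client. The base URL shown is Together.ai's documented OpenAI-compatible endpoint; the model identifier and the TOGETHER_API_KEY environment variable are placeholders, so check the provider's catalog for the exact name under which this model (if any) is listed.

# Hedged sketch: calling an OpenAI-compatible hosted endpoint.
# Assumptions: the `openai` package is installed, an API key is exported as
# TOGETHER_API_KEY, and the model id below matches the provider's catalog.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.together.xyz/v1",  # Together.ai's OpenAI-compatible endpoint
    api_key=os.environ["TOGETHER_API_KEY"],  # assumed environment variable name
)

resp = client.chat.completions.create(
    model="LiquidAI/LFM2-1.2B",  # placeholder id; confirm the exact listing name
    messages=[{"role": "user", "content": "Give me a one-line summary of edge AI."}],
    max_tokens=64,
)
print(resp.choices[0].message.content)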

Disclosure: We may earn a commission from these partners. This helps keep LLMYourWay free.