KORMo 19B MoE

Developer: LDCC
Parameters: 19.0B
License: apache-2.0
Type: Language Model (Mixture of Experts)
Languages: 1
Edge AI compatibility: Mobile, Laptop, Server (43GB+ RAM)
Quick Summary

This model extends KORMo-Team/KORMo-10B-sft into a Mixture of Experts (MoE) architecture with two experts.
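The two-expert MoE idea can be sketched in plain Python: a small router scores both experts for each input and the layer output is the gate-weighted sum of the expert outputs. This is a conceptual illustration only, not KORMo's actual implementation; the dimensions, initialization, and class name here are made up.

```python
import math
import random

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

class TwoExpertMoE:
    """Toy two-expert MoE layer (illustrative, not the KORMo code)."""

    def __init__(self, dim, seed=0):
        rng = random.Random(seed)
        # Each expert is a simple dim x dim linear map.
        self.experts = [
            [[rng.gauss(0, 0.02) for _ in range(dim)] for _ in range(dim)]
            for _ in range(2)
        ]
        # The router maps the input to one score per expert.
        self.router = [[rng.gauss(0, 0.02) for _ in range(dim)] for _ in range(2)]

    def _matvec(self, w, x):
        return [sum(wij * xj for wij, xj in zip(row, x)) for row in w]

    def forward(self, x):
        # Router produces a probability over the two experts...
        gate = softmax(self._matvec(self.router, x))
        # ...and the output mixes the expert outputs by those weights.
        a, b = (self._matvec(e, x) for e in self.experts)
        return [gate[0] * ai + gate[1] * bi for ai, bi in zip(a, b)]

moe = TwoExpertMoE(dim=4)
y = moe.forward([1.0, 0.5, -0.5, 0.25])
```

In production MoE models the gate is usually sparse (only the top-scoring expert runs per token), which is what lets total parameter count grow without a proportional compute increase.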

Device Compatibility

Mobile: 4-6GB RAM
Laptop: 16GB RAM
Server: GPU
Minimum recommended: 18GB+ RAM
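The RAM tiers above line up with back-of-envelope weight-memory estimates for a 19.0B-parameter model at common precisions. The sketch below counts weights only (no KV cache or activation overhead), so real usage will be somewhat higher; the byte-per-parameter figures are standard for fp16, 8-bit, and 4-bit quantization.

```python
# Rough weight-only RAM estimates for a 19.0B-parameter model.
PARAMS = 19.0e9

def weight_gib(bytes_per_param):
    """GiB needed to hold all weights at the given precision."""
    return PARAMS * bytes_per_param / 1024**3

fp16_gib = weight_gib(2)    # ~35 GiB: full-precision server territory
int8_gib = weight_gib(1)    # ~18 GiB: matches the minimum recommended tier
int4_gib = weight_gib(0.5)  # ~9 GiB: feasible on a 16GB laptop
```

Under these assumptions, only a 4-bit quantization plausibly fits the laptop tier, and even mobile-class deployment would need aggressive quantization plus offloading.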

Deploy This Model

Production-ready deployment in minutes:

Together.ai: instant API access to this model via a production-ready inference API; start free and scale up.

Replicate: one-click deployment; run the model in the cloud through a simple API with no DevOps required.

Disclosure: We may earn a commission from these partners. This helps keep LLMYourWay free.