morningstar-14b
by kurdman991
Category: Other
14B params · 75 downloads
Status: New, early-stage
Quick Summary
A 14B-parameter model from the MORNINGSTAR-Vision-AI project, with optional 32B and vision variants built from separate Modelfiles (see Code Examples below).
Device Compatibility

Mobile: 4-6GB RAM
Laptop: 16GB RAM
Server: GPU
Minimum Recommended: 14GB+ RAM
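Before creating the model, it can be worth verifying that the machine meets the 14GB+ recommendation above. A minimal sketch (the helper names are hypothetical, and reading `/proc/meminfo` is Linux-specific):

```python
def meets_minimum(total_gb: float, min_gb: float = 14.0) -> bool:
    """Return True if total RAM meets the model's recommended minimum (14GB+)."""
    return total_gb >= min_gb

def total_ram_gb(meminfo_path: str = "/proc/meminfo") -> float:
    """Read total RAM in GB from /proc/meminfo (Linux only)."""
    with open(meminfo_path) as f:
        for line in f:
            if line.startswith("MemTotal:"):
                kb = int(line.split()[1])  # MemTotal is reported in kB
                return kb / (1024 ** 2)
    raise RuntimeError("MemTotal not found in " + meminfo_path)
```

A 16GB laptop clears the bar (`meets_minimum(16.0)` is `True`), while a 4-6GB mobile device does not and would need a smaller quantization.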
Code Examples
Quick Start

```bash
git clone https://github.com/morningstarnasser/MORNINGSTAR-Vision-AI.git
cd MORNINGSTAR-Vision-AI
chmod +x setup_and_benchmark.sh
./setup_and_benchmark.sh
```

Manual Setup

```bash
# Install Ollama (if not installed)
curl -fsSL https://ollama.com/install.sh | sh

# Clone & build
git clone https://github.com/morningstarnasser/MORNINGSTAR-Vision-AI.git
cd MORNINGSTAR-Vision-AI

# Create models (choose what you need)
ollama create morningstar -f Modelfile                # 14B — Fast & powerful
ollama create morningstar-32b -f Modelfile.32b        # 32B — Maximum quality
ollama create morningstar-vision -f Modelfile.vision  # Vision — See images

# Run
ollama run morningstar
```

Deploy This Model
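Besides the hosted options below, a model created locally with `ollama create` can be called programmatically: Ollama serves an HTTP API on port 11434. A minimal stdlib-only sketch (assumes `ollama serve` is running and the `morningstar` model exists locally):

```python
import json
import urllib.request

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's local /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def generate(model: str, prompt: str) -> str:
    """Send the prompt and return the model's completed response text."""
    req = build_generate_request(model, prompt)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama server):
# print(generate("morningstar", "Summarize your capabilities in one sentence."))
```

With `"stream": False` the server returns one JSON object whose `response` field holds the full completion; omit it to receive newline-delimited streaming chunks instead.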
Production-ready deployment in minutes
Together.ai: Instant API access to this model. Production-ready inference API. Start free, scale to millions. [Try Free API]

Replicate: One-click model deployment. Run models in the cloud with simple API. No DevOps required. [Deploy Now]

Disclosure: We may earn a commission from these partners. This helps keep LLMYourWay free.