MERaLiON-2-3B-DPO-CodeSwitch
Audio model by myaccountfor · 3B params · 152 downloads · early-stage
Quick Summary
MERaLiON-2-3B-DPO-CodeSwitch is a 3B-parameter audio-language model that follows text instructions over speech input, such as transcription. As the name indicates, it is a DPO-tuned variant of MERaLiON-2-3B targeting code-switched speech.
Device Compatibility
Mobile: 4-6GB RAM
Laptop: 16GB RAM
Server: GPU
Minimum recommended: 3GB+ RAM
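To approach the 3GB+ minimum on RAM-constrained devices, one option is loading the weights in 4 bits with bitsandbytes. This is a minimal sketch, assuming the model's remote code accepts the standard transformers quantization_config argument; that support is not confirmed anywhere on this page.

import torch
from transformers import BitsAndBytesConfig
from meralion2_model.modeling_meralion2 import MERaLiON2ForConditionalGeneration

# Assumption: the remote MERaLiON2 code path accepts a standard
# transformers quantization_config. NF4 4-bit storage brings ~3B
# parameters down to roughly 2GB of weights.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = MERaLiON2ForConditionalGeneration.from_pretrained(
    "myaccountfor/MERaLiON-2-3B-DPO-CodeSwitch",
    quantization_config=bnb_config,
    device_map="auto",
    trust_remote_code=True,
)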
Code Examples
Usage (Python / PyTorch)
from meralion2_model.modeling_meralion2 import MERaLiON2ForConditionalGeneration
from meralion2_model.processing_meralion2 import MERaLiON2Processor
import torch
import librosa

# Load the model in bfloat16 and let accelerate place it on available devices
model = MERaLiON2ForConditionalGeneration.from_pretrained(
    "myaccountfor/MERaLiON-2-3B-DPO-CodeSwitch",
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)
processor = MERaLiON2Processor.from_pretrained(
    "myaccountfor/MERaLiON-2-3B-DPO-CodeSwitch",
    trust_remote_code=True,
)
model.eval()

# Load audio; the model expects 16 kHz mono input, so resample on load
audio, sr = librosa.load("path/to/audio.wav", sr=16000)

# Build the instruction prompt; <SpeechHere> marks where the audio is spliced in
prompt = "Please transcribe this speech."
input_text = f"Instruction: {prompt} \nFollow the text instruction based on the following audio: <SpeechHere>"

# Wrap the user turn in the Gemma-style chat template
chat_prompt = "<bos><start_of_turn>user\n" + input_text + "<end_of_turn>\n<start_of_turn>model\n"

# Tokenize the text and extract audio features in one call, then move
# tensors to the model's device
inputs = processor(text=[chat_prompt], audios=[audio])
inputs = {k: v.to(model.device) if torch.is_tensor(v) else v for k, v in inputs.items()}

# Greedy decoding; strip the prompt tokens before decoding the answer
with torch.no_grad():
    outputs = model.generate(
        **inputs,
        max_new_tokens=256,
        do_sample=False,
        use_cache=True,
    )
generated_ids = outputs[:, inputs["input_ids"].size(1):]
transcription = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(transcription)
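The same pipeline handles other audio instructions by changing only the prompt string; the prompt below is an illustrative example, not one documented for this checkpoint.

# Hypothetical alternative instruction; the rest of the pipeline is unchanged
prompt = "Please translate this speech into English."
input_text = f"Instruction: {prompt} \nFollow the text instruction based on the following audio: <SpeechHere>"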